[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15794 1726882602.12472: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-4FB
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15794 1726882602.12829: Added group all to inventory
15794 1726882602.12831: Added group ungrouped to inventory
15794 1726882602.12836: Group all now contains ungrouped
15794 1726882602.12839: Examining possible inventory source: /tmp/network-lQx/inventory.yml
15794 1726882602.25033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15794 1726882602.25088: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15794 1726882602.25108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15794 1726882602.25158: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15794 1726882602.25220: Loaded config def from plugin (inventory/script)
15794 1726882602.25221: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15794 1726882602.25256: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15794 1726882602.25329: Loaded config def from plugin (inventory/yaml)
15794 1726882602.25331: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15794 1726882602.25404: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15794 1726882602.25760: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15794 1726882602.25763: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15794 1726882602.25765: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15794 1726882602.25770: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15794 1726882602.25774: Loading data from /tmp/network-lQx/inventory.yml
15794 1726882602.25831: /tmp/network-lQx/inventory.yml was not parsable by auto
15794 1726882602.25886: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15794 1726882602.25920: Loading data from /tmp/network-lQx/inventory.yml
15794 1726882602.25989: group all already in inventory
15794 1726882602.25995: set inventory_file for managed_node1
15794 1726882602.25998: set inventory_dir for managed_node1
15794 1726882602.25999: Added host managed_node1 to inventory
15794 1726882602.26001: Added host managed_node1 to group all
15794 1726882602.26002: set ansible_host for managed_node1
15794 1726882602.26003: set ansible_ssh_extra_args for managed_node1
15794 1726882602.26005: set inventory_file for managed_node2
15794 1726882602.26007: set inventory_dir for managed_node2
15794 1726882602.26008: Added host managed_node2 to inventory
15794 1726882602.26009: Added host managed_node2 to group all
15794 1726882602.26009: set ansible_host for managed_node2
15794 1726882602.26010: set ansible_ssh_extra_args for managed_node2
15794 1726882602.26012: set inventory_file for managed_node3
15794 1726882602.26014: set inventory_dir for managed_node3
15794 1726882602.26014: Added host managed_node3 to inventory
15794 1726882602.26015: Added host managed_node3 to group all
15794 1726882602.26016: set ansible_host for managed_node3
15794 1726882602.26016: set ansible_ssh_extra_args for managed_node3
15794 1726882602.26018: Reconcile groups and hosts in inventory.
15794 1726882602.26021: Group ungrouped now contains managed_node1
15794 1726882602.26023: Group ungrouped now contains managed_node2
15794 1726882602.26024: Group ungrouped now contains managed_node3
15794 1726882602.26092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
15794 1726882602.26195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
15794 1726882602.26235: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
15794 1726882602.26261: Loaded config def from plugin (vars/host_group_vars)
15794 1726882602.26263: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
15794 1726882602.26270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
15794 1726882602.26276: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
15794 1726882602.26312: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
15794 1726882602.26578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882602.26655: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
15794 1726882602.26689: Loaded config def from plugin (connection/local)
15794 1726882602.26691: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
15794 1726882602.27209: Loaded config def from plugin (connection/paramiko_ssh)
15794 1726882602.27212: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
15794 1726882602.27937: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15794 1726882602.27969: Loaded config def from plugin (connection/psrp)
15794 1726882602.27971: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
15794 1726882602.28569: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15794 1726882602.28601: Loaded config def from plugin (connection/ssh)
15794 1726882602.28604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
15794 1726882602.30180: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15794 1726882602.30211: Loaded config def from plugin (connection/winrm)
15794 1726882602.30213: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
15794 1726882602.30239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
15794 1726882602.30293: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
15794 1726882602.30349: Loaded config def from plugin (shell/cmd)
15794 1726882602.30350: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
15794 1726882602.30370: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
15794 1726882602.30426: Loaded config def from plugin (shell/powershell)
15794 1726882602.30428: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
15794 1726882602.30472: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
15794 1726882602.30623: Loaded config def from plugin (shell/sh)
15794 1726882602.30625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
15794 1726882602.30654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
15794 1726882602.30757: Loaded config def from plugin (become/runas)
15794 1726882602.30759: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
15794 1726882602.30917: Loaded config def from plugin (become/su)
15794 1726882602.30919: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
15794 1726882602.31056: Loaded config def from plugin (become/sudo)
15794 1726882602.31058: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
15794 1726882602.31085: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
15794 1726882602.31351: in VariableManager get_vars()
15794 1726882602.31369: done with get_vars()
15794 1726882602.31474: trying /usr/local/lib/python3.12/site-packages/ansible/modules
15794 1726882602.33771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
15794 1726882602.33866: in VariableManager get_vars()
15794 1726882602.33870: done with get_vars()
15794 1726882602.33872: variable 'playbook_dir' from source: magic vars
15794 1726882602.33873: variable 'ansible_playbook_python' from source: magic vars
15794 1726882602.33874: variable 'ansible_config_file' from source: magic vars
15794 1726882602.33875: variable 'groups' from source: magic vars
15794 1726882602.33876: variable 'omit' from source: magic vars
15794 1726882602.33877: variable 'ansible_version' from source: magic vars
15794 1726882602.33877: variable 'ansible_check_mode' from source: magic vars
15794 1726882602.33878: variable 'ansible_diff_mode' from source: magic vars
15794 1726882602.33879: variable 'ansible_forks' from source: magic vars
15794 1726882602.33880: variable 'ansible_inventory_sources' from source: magic vars
15794 1726882602.33880: variable 'ansible_skip_tags' from source: magic vars
15794 1726882602.33881: variable 'ansible_limit' from source: magic vars
15794 1726882602.33882: variable 'ansible_run_tags' from source: magic vars
15794 1726882602.33882: variable 'ansible_verbosity' from source: magic vars
15794 1726882602.33913: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml
15794 1726882602.34455: in VariableManager get_vars()
15794 1726882602.34468: done with get_vars()
15794 1726882602.34498: in VariableManager get_vars()
15794 1726882602.34514: done with get_vars()
15794 1726882602.34546: in VariableManager get_vars()
15794 1726882602.34556: done with get_vars()
15794 1726882602.34580: in VariableManager get_vars()
15794 1726882602.34588: done with get_vars()
15794 1726882602.34652: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15794 1726882602.34813: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15794 1726882602.34930: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15794 1726882602.35462: in VariableManager get_vars()
15794 1726882602.35477: done with get_vars()
15794 1726882602.35815: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
15794 1726882602.35933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15794 1726882602.36964: in VariableManager get_vars()
15794 1726882602.36978: done with get_vars()
15794 1726882602.37082: in VariableManager get_vars()
15794 1726882602.37084: done with get_vars()
15794 1726882602.37086: variable 'playbook_dir' from source: magic vars
15794 1726882602.37087: variable 'ansible_playbook_python' from source: magic vars
15794 1726882602.37088: variable 'ansible_config_file' from source: magic vars
15794 1726882602.37088: variable 'groups' from source: magic vars
15794 1726882602.37089: variable 'omit' from source: magic vars
15794 1726882602.37090: variable 'ansible_version' from source: magic vars
15794 1726882602.37090: variable 'ansible_check_mode' from source: magic vars
15794 1726882602.37091: variable 'ansible_diff_mode' from source: magic vars
15794 1726882602.37091: variable 'ansible_forks' from source: magic vars
15794 1726882602.37092: variable 'ansible_inventory_sources' from source: magic vars
15794 1726882602.37092: variable 'ansible_skip_tags' from source: magic vars
15794 1726882602.37093: variable 'ansible_limit' from source: magic vars
15794 1726882602.37094: variable 'ansible_run_tags' from source: magic vars
15794 1726882602.37094: variable 'ansible_verbosity' from source: magic vars
15794 1726882602.37120: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
15794 1726882602.37180: in VariableManager get_vars()
15794 1726882602.37183: done with get_vars()
15794 1726882602.37184: variable 'playbook_dir' from source: magic vars
15794 1726882602.37185: variable 'ansible_playbook_python' from source: magic vars
15794 1726882602.37185: variable 'ansible_config_file' from source: magic vars
15794 1726882602.37186: variable 'groups' from source: magic vars
15794 1726882602.37187: variable 'omit' from source: magic vars
15794 1726882602.37187: variable 'ansible_version' from source: magic vars
15794 1726882602.37188: variable 'ansible_check_mode' from source: magic vars
15794 1726882602.37189: variable 'ansible_diff_mode' from source: magic vars
15794 1726882602.37189: variable 'ansible_forks' from source: magic vars
15794 1726882602.37190: variable 'ansible_inventory_sources' from source: magic vars
15794 1726882602.37190: variable 'ansible_skip_tags' from source: magic vars
15794 1726882602.37191: variable 'ansible_limit' from source: magic vars
15794 1726882602.37191: variable 'ansible_run_tags' from source: magic vars
15794 1726882602.37192: variable 'ansible_verbosity' from source: magic vars
15794 1726882602.37217: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
15794 1726882602.37283: in VariableManager get_vars()
15794 1726882602.37294: done with get_vars()
15794 1726882602.37326: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15794 1726882602.37415: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15794 1726882602.37476: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15794 1726882602.37859: in VariableManager get_vars()
15794 1726882602.37873: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15794 1726882602.39175: in VariableManager get_vars()
15794 1726882602.39192: done with get_vars()
15794 1726882602.39221: in VariableManager get_vars()
15794 1726882602.39223: done with get_vars()
15794 1726882602.39225: variable 'playbook_dir' from source: magic vars
15794 1726882602.39226: variable 'ansible_playbook_python' from source: magic vars
15794 1726882602.39226: variable 'ansible_config_file' from source: magic vars
15794 1726882602.39227: variable 'groups' from source: magic vars
15794 1726882602.39227: variable 'omit' from source: magic vars
15794 1726882602.39228: variable 'ansible_version' from source: magic vars
15794 1726882602.39229: variable 'ansible_check_mode' from source: magic vars
15794 1726882602.39230: variable 'ansible_diff_mode' from source: magic vars
15794 1726882602.39231: variable 'ansible_forks' from source: magic vars
15794 1726882602.39232: variable 'ansible_inventory_sources' from source: magic vars
15794 1726882602.39232: variable 'ansible_skip_tags' from source: magic vars
15794 1726882602.39233: variable 'ansible_limit' from source: magic vars
15794 1726882602.39235: variable 'ansible_run_tags' from source: magic vars
15794 1726882602.39235: variable 'ansible_verbosity' from source: magic vars
15794 1726882602.39260: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
15794 1726882602.39315: in VariableManager get_vars()
15794 1726882602.39325: done with get_vars()
15794 1726882602.39360: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15794 1726882602.40594: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15794 1726882602.40658: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15794 1726882602.40971: in VariableManager get_vars()
15794 1726882602.40990: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15794 1726882602.42295: in VariableManager get_vars()
15794 1726882602.42306: done with get_vars()
15794 1726882602.42335: in VariableManager get_vars()
15794 1726882602.42344: done with get_vars()
15794 1726882602.42395: in VariableManager get_vars()
15794 1726882602.42404: done with get_vars()
15794 1726882602.42477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
15794 1726882602.42492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
15794 1726882602.42673: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
15794 1726882602.42802: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
15794 1726882602.42805: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
15794 1726882602.42832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
15794 1726882602.42854: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
15794 1726882602.42995: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
15794 1726882602.43049: Loaded config def from plugin (callback/default)
15794 1726882602.43051: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15794 1726882602.43986: Loaded config def from plugin (callback/junit)
15794 1726882602.43988: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15794 1726882602.44026: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
15794 1726882602.44081: Loaded config def from plugin (callback/minimal)
15794 1726882602.44083: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15794 1726882602.44116: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15794 1726882602.44168: Loaded config def from plugin (callback/tree)
15794 1726882602.44170: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15794 1726882602.44275: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15794 1726882602.44277: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
15794 1726882602.44297: in VariableManager get_vars()
15794 1726882602.44307: done with get_vars()
15794 1726882602.44311: in VariableManager get_vars()
15794 1726882602.44317: done with get_vars()
15794 1726882602.44320: variable 'omit' from source: magic vars
15794 1726882602.44350: in VariableManager get_vars()
15794 1726882602.44362: done with get_vars()
15794 1726882602.44379: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
15794 1726882602.44810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15794 1726882602.44867: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15794 1726882602.44897: getting the remaining hosts for this loop
15794 1726882602.44899: done getting the remaining hosts for this loop
15794 1726882602.44902: getting the next task for host managed_node1
15794 1726882602.44905: done getting next task for host managed_node1
15794 1726882602.44906: ^ task is: TASK: Gathering Facts
15794 1726882602.44908: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882602.44913: getting variables
15794 1726882602.44914: in VariableManager get_vars()
15794 1726882602.44921: Calling all_inventory to load vars for managed_node1
15794 1726882602.44923: Calling groups_inventory to load vars for managed_node1
15794 1726882602.44926: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882602.44938: Calling all_plugins_play to load vars for managed_node1
15794 1726882602.44947: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882602.44949: Calling groups_plugins_play to load vars for managed_node1
15794 1726882602.44975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882602.45020: done with get_vars()
15794 1726882602.45025: done getting variables
15794 1726882602.45079: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Friday 20 September 2024 21:36:42 -0400 (0:00:00.008) 0:00:00.008 ******
15794 1726882602.45096: entering _queue_task() for managed_node1/gather_facts
15794 1726882602.45097: Creating lock for gather_facts
15794 1726882602.45389: worker is 1 (out of 1 available)
15794 1726882602.45402: exiting _queue_task() for managed_node1/gather_facts
15794 1726882602.45417: done queuing things up, now waiting for results queue to drain
15794 1726882602.45419: waiting for pending results...
15794 1726882602.45571: running TaskExecutor() for managed_node1/TASK: Gathering Facts
15794 1726882602.45642: in run() - task 0affe814-3a2d-94e5-e48f-00000000007c
15794 1726882602.45654: variable 'ansible_search_path' from source: unknown
15794 1726882602.45694: calling self._execute()
15794 1726882602.45743: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882602.45750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882602.45760: variable 'omit' from source: magic vars
15794 1726882602.45846: variable 'omit' from source: magic vars
15794 1726882602.45871: variable 'omit' from source: magic vars
15794 1726882602.45901: variable 'omit' from source: magic vars
15794 1726882602.45942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15794 1726882602.45973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15794 1726882602.45993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15794 1726882602.46009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882602.46023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882602.46050: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15794 1726882602.46053: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882602.46059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882602.46144: Set connection var ansible_connection to ssh
15794 1726882602.46152: Set connection var ansible_module_compression to ZIP_DEFLATED
15794 1726882602.46160: Set connection var ansible_pipelining to False
15794 1726882602.46166: Set connection var ansible_shell_executable to /bin/sh
15794 1726882602.46170: Set connection var ansible_shell_type to sh
15794 1726882602.46178: Set connection var ansible_timeout to 10
15794 1726882602.46205: variable 'ansible_shell_executable' from source: unknown
15794 1726882602.46209: variable 'ansible_connection' from source: unknown
15794 1726882602.46212: variable 'ansible_module_compression' from source: unknown
15794 1726882602.46215: variable 'ansible_shell_type' from source: unknown
15794 1726882602.46218: variable 'ansible_shell_executable' from source: unknown
15794 1726882602.46223: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882602.46228: variable 'ansible_pipelining' from source: unknown
15794 1726882602.46232: variable 'ansible_timeout' from source: unknown
15794 1726882602.46243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882602.46395: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15794 1726882602.46404: variable 'omit' from source: magic vars
15794 1726882602.46411: starting attempt loop
15794 1726882602.46414: running the handler
15794 1726882602.46430: variable 'ansible_facts' from source: unknown
15794 1726882602.46448: _low_level_execute_command(): starting
15794 1726882602.46456: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15794 1726882602.47013: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15794 1726882602.47018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882602.47021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<<
15794 1726882602.47023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882602.47091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15794 1726882602.47095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15794 1726882602.47098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15794 1726882602.47163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15794 1726882602.48919: stdout chunk (state=3): >>>/root <<<
15794 1726882602.49030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15794 1726882602.49086: stderr chunk (state=3): >>><<<
15794 1726882602.49089: stdout chunk (state=3): >>><<<
15794 1726882602.49109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15794 1726882602.49120: _low_level_execute_command(): starting
15794 1726882602.49126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376 `" && echo ansible-tmp-1726882602.4910913-15810-192487628213376="` echo /root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376 `" ) && sleep 0'
15794 1726882602.49586: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15794 1726882602.49589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882602.49592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<<
15794 1726882602.49594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1:
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882602.49601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882602.49652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882602.49659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882602.49719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882602.51730: stdout chunk (state=3): >>>ansible-tmp-1726882602.4910913-15810-192487628213376=/root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376 <<< 15794 1726882602.51844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882602.51890: stderr chunk (state=3): >>><<< 15794 1726882602.51895: stdout chunk (state=3): >>><<< 15794 1726882602.51909: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882602.4910913-15810-192487628213376=/root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882602.51936: variable 'ansible_module_compression' from source: unknown 15794 1726882602.51981: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15794 1726882602.51985: ANSIBALLZ: Acquiring lock 15794 1726882602.51988: ANSIBALLZ: Lock acquired: 139758818400528 15794 1726882602.51996: ANSIBALLZ: Creating module 15794 1726882602.88262: ANSIBALLZ: Writing module into payload 15794 1726882602.88386: ANSIBALLZ: Writing module 15794 1726882602.88600: ANSIBALLZ: Renaming module 15794 1726882602.88612: ANSIBALLZ: Done creating module 15794 1726882602.88748: variable 'ansible_facts' from source: unknown 15794 1726882602.88762: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882602.88776: _low_level_execute_command(): starting 15794 1726882602.88791: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15794 1726882602.89990: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882602.90046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882602.90065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882602.90190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882602.91986: stdout chunk (state=3): >>>PLATFORM <<< 15794 1726882602.92065: stdout chunk (state=3): >>>Linux <<< 15794 1726882602.92294: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15794 1726882602.92414: stdout chunk (state=3): >>><<< 15794 1726882602.92424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882602.92463: stderr chunk (state=3): >>><<< 15794 1726882602.92508: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882602.92727 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15794 1726882602.92732: _low_level_execute_command(): starting 15794 1726882602.92737: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15794 1726882602.92873: Sending initial data 15794 1726882602.92876: Sent initial data (1181 bytes) 15794 1726882602.93342: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882602.93413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882602.93473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882602.93492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882602.93527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882602.93643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882602.97360: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 15794 1726882602.97690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882602.98040: stderr chunk (state=3): >>><<< 15794 1726882602.98044: stdout chunk (state=3): >>><<< 15794 1726882602.98047: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882602.98119: variable 'ansible_facts' from source: unknown 15794 1726882602.98129: variable 'ansible_facts' from source: unknown 15794 1726882602.98150: variable 'ansible_module_compression' from source: unknown 15794 1726882602.98256: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882602.98274: variable 'ansible_facts' from source: unknown 15794 1726882602.98448: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/AnsiballZ_setup.py 15794 1726882602.98666: Sending initial data 15794 1726882602.98670: Sent initial data (154 bytes) 15794 1726882602.99252: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882602.99350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882602.99374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882602.99393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882602.99416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882602.99505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882603.01197: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882603.01314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882603.01370: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpszohlh68 /root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/AnsiballZ_setup.py <<< 15794 1726882603.01373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/AnsiballZ_setup.py" <<< 15794 1726882603.01445: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpszohlh68" to remote "/root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/AnsiballZ_setup.py" <<< 15794 1726882603.05973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882603.06015: stderr chunk (state=3): >>><<< 15794 1726882603.06215: stdout chunk (state=3): >>><<< 15794 1726882603.06219: done transferring module to remote 15794 1726882603.06221: _low_level_execute_command(): starting 15794 1726882603.06223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/ /root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/AnsiballZ_setup.py && sleep 0' 15794 1726882603.07293: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882603.07355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882603.07372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882603.07404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882603.07485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882603.09583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882603.09588: stdout chunk (state=3): >>><<< 15794 1726882603.09591: stderr chunk (state=3): >>><<< 15794 1726882603.09594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882603.09596: _low_level_execute_command(): starting 15794 1726882603.09599: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/AnsiballZ_setup.py && sleep 0' 15794 1726882603.10497: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882603.10501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882603.10507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882603.10510: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882603.10513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882603.10564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882603.10577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882603.10666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882603.12881: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15794 1726882603.12904: stdout chunk (state=3): >>>import _imp # builtin <<< 15794 1726882603.12947: stdout chunk (state=3): >>>import '_thread' # <<< 15794 1726882603.12959: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 15794 1726882603.13021: stdout chunk (state=3): >>>import '_io' # <<< 15794 1726882603.13032: stdout chunk (state=3): >>>import 'marshal' # <<< 15794 1726882603.13069: stdout chunk (state=3): >>>import 'posix' # <<< 15794 1726882603.13113: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15794 1726882603.13126: stdout chunk (state=3): >>>import 'time' # <<< 15794 1726882603.13150: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 15794 1726882603.13203: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 15794 1726882603.13216: stdout chunk (state=3): >>> import '_codecs' # <<< 15794 
1726882603.13250: stdout chunk (state=3): >>>import 'codecs' # <<< 15794 1726882603.13295: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15794 1726882603.13317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e232c530> <<< 15794 1726882603.13352: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e22fbb30> <<< 15794 1726882603.13365: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e232eab0> <<< 15794 1726882603.13393: stdout chunk (state=3): >>>import '_signal' # <<< 15794 1726882603.13419: stdout chunk (state=3): >>>import '_abc' # <<< 15794 1726882603.13449: stdout chunk (state=3): >>>import 'abc' # <<< 15794 1726882603.13453: stdout chunk (state=3): >>>import 'io' # <<< 15794 1726882603.13483: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15794 1726882603.13583: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15794 1726882603.13609: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15794 1726882603.13646: stdout chunk (state=3): >>>import 'os' # <<< 15794 1726882603.13669: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 15794 1726882603.13709: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 15794 1726882603.13737: stdout chunk (state=3): >>>Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15794 1726882603.13753: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15794 1726882603.13773: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e20fd160> <<< 15794 1726882603.13843: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15794 1726882603.13846: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.13881: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e20fdfd0> <<< 15794 1726882603.13884: stdout chunk (state=3): >>>import 'site' # <<< 15794 1726882603.13920: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15794 1726882603.14326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15794 1726882603.14349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15794 1726882603.14375: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.14394: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15794 1726882603.14440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15794 1726882603.14470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15794 1726882603.14495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e213bda0> <<< 15794 1726882603.14527: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15794 1726882603.14542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15794 1726882603.14572: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e213bfe0> <<< 15794 1726882603.14592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15794 1726882603.14617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15794 1726882603.14650: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15794 1726882603.14697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.14720: stdout chunk (state=3): >>>import 'itertools' # <<< 15794 1726882603.14750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 15794 1726882603.14789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21737d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15794 1726882603.14795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2173e60> <<< 15794 1726882603.14820: stdout chunk (state=3): >>>import '_collections' # <<< 15794 1726882603.14865: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2153a70> <<< 15794 1726882603.14881: stdout chunk (state=3): >>>import '_functools' # <<< 15794 1726882603.14901: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2151190> <<< 15794 1726882603.15003: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2138f50> <<< 15794 1726882603.15041: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15794 1726882603.15055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15794 1726882603.15070: stdout chunk 
(state=3): >>>import '_sre' # <<< 15794 1726882603.15099: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15794 1726882603.15110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15794 1726882603.15146: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15794 1726882603.15192: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21976b0> <<< 15794 1726882603.15195: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21962d0> <<< 15794 1726882603.15227: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2152030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e213ae40> <<< 15794 1726882603.15293: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c8770> <<< 15794 1726882603.15308: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21381d0> <<< 15794 1726882603.15332: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from 
'/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15794 1726882603.15380: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.15383: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21c8c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c8ad0> <<< 15794 1726882603.15431: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.15436: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21c8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2136d20> <<< 15794 1726882603.15472: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.15498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15794 1726882603.15530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15794 1726882603.15545: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c9520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c91f0> <<< 15794 1726882603.15584: stdout chunk (state=3): >>>import 
'importlib.machinery' # <<< 15794 1726882603.15607: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15794 1726882603.15631: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21ca420> import 'importlib.util' # import 'runpy' # <<< 15794 1726882603.15661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15794 1726882603.15698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15794 1726882603.15736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 15794 1726882603.15751: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e4620> import 'errno' # <<< 15794 1726882603.15779: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.15798: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21e5d60> <<< 15794 1726882603.15827: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15794 1726882603.15855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15794 1726882603.15867: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e6c60> <<< 15794 1726882603.15915: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21e72c0> <<< 15794 1726882603.15939: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e61b0> <<< 15794 1726882603.15958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15794 1726882603.15994: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21e7d40> <<< 15794 1726882603.16014: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e7470> <<< 15794 1726882603.16060: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21ca390> <<< 15794 1726882603.16090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15794 1726882603.16106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15794 1726882603.16145: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15794 1726882603.16158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15794 1726882603.16195: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1edfcb0> <<< 15794 1726882603.16216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15794 1726882603.16253: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.16294: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1f08770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f084d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.16298: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1f087a0> <<< 15794 1726882603.16338: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1f08980> <<< 15794 1726882603.16341: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1edde50> <<< 15794 1726882603.16369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15794 1726882603.16489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15794 1726882603.16514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15794 1726882603.16535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f0a000> <<< 15794 1726882603.16550: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f08c80> <<< 15794 1726882603.16574: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21cab40> <<< 15794 1726882603.16600: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15794 1726882603.16674: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.16677: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15794 1726882603.16722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15794 1726882603.16750: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f3a360> <<< 15794 1726882603.16817: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15794 1726882603.16821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.16848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15794 1726882603.16866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15794 1726882603.16917: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f52510> <<< 15794 1726882603.16935: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15794 1726882603.16973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15794 1726882603.17026: stdout chunk (state=3): >>>import 'ntpath' # <<< 15794 1726882603.17150: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f8b2f0> <<< 15794 1726882603.17174: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15794 1726882603.17191: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15794 1726882603.17283: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1fb1a90> <<< 15794 1726882603.17354: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f8b410> <<< 15794 1726882603.17411: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f531a0> <<< 15794 1726882603.17438: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1dd03e0> <<< 15794 1726882603.17462: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f51550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f0af00> <<< 15794 1726882603.17640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15794 1726882603.17653: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa8e1dd05c0> <<< 15794 1726882603.17835: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_445g8ccm/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 15794 1726882603.17980: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.18016: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15794 1726882603.18078: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15794 1726882603.18146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15794 1726882603.18184: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e36000> <<< 15794 1726882603.18206: stdout chunk (state=3): >>>import '_typing' # <<< 15794 1726882603.18399: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e0cef0> <<< 15794 1726882603.18432: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1dd3f80> # zipimport: zlib available <<< 15794 1726882603.18459: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 15794 1726882603.18474: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 15794 1726882603.18504: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.20092: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.21423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e0fe60> <<< 15794 1726882603.21457: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15794 1726882603.21475: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15794 1726882603.21500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15794 1726882603.21522: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15794 1726882603.21562: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1e659a0> <<< 15794 1726882603.21594: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e65730> <<< 15794 1726882603.21637: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e65040> <<< 15794 1726882603.21669: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15794 1726882603.21708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e65790> <<< 15794 1726882603.21712: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e36a20> <<< 15794 1726882603.21745: stdout chunk (state=3): >>>import 'atexit' # <<< 15794 1726882603.21749: stdout chunk (state=3): >>># extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1e666c0> <<< 15794 1726882603.21792: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1e66840> <<< 15794 1726882603.21809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15794 1726882603.21862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15794 1726882603.21878: stdout chunk (state=3): >>>import '_locale' # <<< 15794 1726882603.21922: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e66d50> <<< 15794 1726882603.21943: stdout chunk (state=3): >>>import 'pwd' # <<< 15794 1726882603.21963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15794 1726882603.21975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15794 1726882603.22027: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cccb30> <<< 15794 1726882603.22065: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1cce7e0> <<< 15794 1726882603.22078: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15794 1726882603.22090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15794 1726882603.22126: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1ccf1a0> <<< 15794 1726882603.22154: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15794 1726882603.22173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15794 1726882603.22203: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd0380> <<< 15794 1726882603.22218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15794 1726882603.22267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15794 1726882603.22288: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15794 1726882603.22346: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd2db0> <<< 15794 1726882603.22384: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1cd2ed0> <<< 15794 1726882603.22408: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd10a0> <<< 15794 1726882603.22423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15794 1726882603.22476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15794 1726882603.22479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 15794 1726882603.22505: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15794 1726882603.22547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15794 1726882603.22588: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd6bd0> <<< 15794 1726882603.22603: stdout chunk (state=3): >>>import '_tokenize' # <<< 15794 1726882603.22672: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd56d0> <<< 15794 1726882603.22693: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd5430> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15794 1726882603.22782: stdout chunk (state=3): >>>import 'textwrap' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd7b30> <<< 15794 1726882603.22810: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd15b0> <<< 15794 1726882603.22842: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.22879: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d1ad20> <<< 15794 1726882603.22884: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 15794 1726882603.22911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1af60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15794 1726882603.22939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15794 1726882603.22957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15794 1726882603.22987: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d1ca10> <<< 15794 1726882603.23003: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1c7d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15794 1726882603.23129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15794 1726882603.23185: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.23212: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d1ef90> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1d100> <<< 15794 1726882603.23217: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15794 1726882603.23286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.23307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15794 1726882603.23310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15794 1726882603.23319: stdout chunk (state=3): >>>import '_string' # <<< 15794 1726882603.23362: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2a780> <<< 15794 1726882603.23517: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1f110> <<< 15794 1726882603.23598: stdout chunk (state=3): >>># extension 
module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2b560> <<< 15794 1726882603.23630: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2b9b0> <<< 15794 1726882603.23693: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.23709: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2ba40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1b140> <<< 15794 1726882603.23740: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15794 1726882603.23756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15794 1726882603.23784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15794 1726882603.23808: stdout chunk 
(state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.23848: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2e9f0> <<< 15794 1726882603.24042: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.24047: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2fdd0> <<< 15794 1726882603.24082: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2d190> <<< 15794 1726882603.24110: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2e090> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2cd40> <<< 15794 1726882603.24136: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15794 1726882603.24153: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 15794 1726882603.24169: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.24264: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.24373: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.24403: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 15794 1726882603.24429: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15794 1726882603.24447: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.24586: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.24730: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.25456: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.26126: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15794 1726882603.26153: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15794 1726882603.26173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.26245: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.26261: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bb8080> <<< 15794 1726882603.26353: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15794 1726882603.26380: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bb9520> import 'ctypes' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2e540> <<< 15794 1726882603.26437: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15794 1726882603.26497: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15794 1726882603.26501: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 15794 1726882603.26511: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.26672: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.26877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bb8d70> <<< 15794 1726882603.26911: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.27466: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28025: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28107: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28199: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 15794 1726882603.28256: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28303: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15794 1726882603.28306: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28393: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28529: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15794 1726882603.28533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15794 1726882603.28558: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib 
available <<< 15794 1726882603.28594: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28652: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15794 1726882603.28666: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.28943: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.29227: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15794 1726882603.29308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15794 1726882603.29312: stdout chunk (state=3): >>>import '_ast' # <<< 15794 1726882603.29412: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bbb770> <<< 15794 1726882603.29424: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.29504: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.29590: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15794 1726882603.29603: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15794 1726882603.29632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15794 1726882603.29647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15794 1726882603.29725: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.29851: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bc1a60> <<< 15794 1726882603.29916: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bc23c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d33260> <<< 15794 1726882603.29936: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.29985: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30027: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15794 1726882603.30043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30081: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30137: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30192: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30268: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15794 1726882603.30314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.30407: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bc1100> <<< 15794 1726882603.30468: stdout chunk (state=3): 
>>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bc2570> <<< 15794 1726882603.30497: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15794 1726882603.30584: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30643: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30677: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.30723: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.30743: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15794 1726882603.30782: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15794 1726882603.30803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15794 1726882603.30870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15794 1726882603.30884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15794 1726882603.30899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15794 1726882603.30966: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c567b0> <<< 15794 1726882603.31004: stdout chunk (state=3): >>>import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bcc3e0> <<< 15794 1726882603.31365: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bc65a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bc63c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 15794 1726882603.31389: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31413: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31432: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31484: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31574: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15794 1726882603.31620: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31704: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31794: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31810: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.31851: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15794 1726882603.32061: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.32253: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.32300: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15794 1726882603.32353: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 15794 1726882603.32378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 15794 1726882603.32400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15794 1726882603.32418: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15794 1726882603.32456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15794 1726882603.32472: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5cd10> <<< 15794 1726882603.32503: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15794 1726882603.32513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15794 1726882603.32541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15794 1726882603.32579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15794 1726882603.32614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15794 1726882603.32617: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15794 1726882603.32642: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1147d70> <<< 15794 1726882603.32676: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.32695: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1148110> <<< 15794 1726882603.32744: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bd4e30> <<< 15794 1726882603.32761: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bd4380> <<< 15794 1726882603.32811: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5eba0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5e5a0> <<< 15794 1726882603.32825: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15794 1726882603.32876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15794 1726882603.32908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 15794 1726882603.32935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 15794 1726882603.32957: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15794 1726882603.32982: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.33009: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e114b170> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e114aa20> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.33032: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e114ac00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1149e50> <<< 15794 1726882603.33055: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15794 1726882603.33178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15794 1726882603.33198: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e114b290> <<< 15794 1726882603.33210: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15794 1726882603.33255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15794 1726882603.33275: stdout chunk (state=3): >>># extension 
module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e11bddc0> <<< 15794 1726882603.33301: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e114bda0> <<< 15794 1726882603.33337: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5e5d0> import 'ansible.module_utils.facts.timeout' # <<< 15794 1726882603.33376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 15794 1726882603.33399: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15794 1726882603.33412: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33471: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15794 1726882603.33548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33606: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15794 1726882603.33689: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33707: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15794 1726882603.33744: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 15794 1726882603.33789: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33841: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 15794 1726882603.33905: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33945: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.33996: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15794 1726882603.34008: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.34069: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.34128: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.34194: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.34275: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 15794 1726882603.34285: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15794 1726882603.34851: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 15794 1726882603.35420: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35487: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35513: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35572: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 15794 1726882603.35575: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35595: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35737: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15794 
1726882603.35749: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15794 1726882603.35778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15794 1726882603.35800: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35817: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35853: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 15794 1726882603.35865: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35892: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.35925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 15794 1726882603.35946: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36019: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15794 1726882603.36169: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e11befc0> <<< 15794 1726882603.36175: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15794 1726882603.36210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15794 1726882603.36341: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e11be930> <<< 15794 1726882603.36355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15794 1726882603.36429: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36496: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 15794 1726882603.36600: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15794 1726882603.36726: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36782: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36872: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15794 1726882603.36903: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36918: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.36967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15794 1726882603.37014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15794 1726882603.37103: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.37157: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e11ea0f0> <<< 15794 1726882603.37371: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e11daed0> import 'ansible.module_utils.facts.system.python' # <<< 15794 1726882603.37387: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.37444: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.37506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15794 1726882603.37525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 
1726882603.37602: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.37693: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.37815: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.37993: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 15794 1726882603.38016: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38031: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38080: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15794 1726882603.38129: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38132: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38188: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15794 1726882603.38249: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882603.38284: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1005c70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1005880> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 15794 1726882603.38300: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38336: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38392: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.base' # <<< 15794 1726882603.38395: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38574: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15794 1726882603.38752: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38855: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.38971: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39020: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39082: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 15794 1726882603.39086: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39127: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39130: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39286: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39455: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15794 1726882603.39604: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15794 1726882603.39755: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.39866: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.40042: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.40461: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.41042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 
'ansible.module_utils.facts.hardware.hurd' # <<< 15794 1726882603.41059: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.41168: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.41289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 15794 1726882603.41397: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.41514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15794 1726882603.41525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.41685: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.41876: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15794 1726882603.41882: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.41905: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 15794 1726882603.41948: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 15794 1726882603.42006: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42109: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42217: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42449: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15794 1726882603.42706: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42739: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42800: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.darwin' # <<< 15794 1726882603.42828: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.42858: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15794 1726882603.42927: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43013: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 15794 1726882603.43016: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15794 1726882603.43090: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43141: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 15794 1726882603.43217: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43281: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15794 1726882603.43356: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43649: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.43960: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15794 1726882603.43964: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44021: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44085: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15794 1726882603.44113: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44131: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15794 1726882603.44187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15794 1726882603.44199: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44240: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15794 1726882603.44295: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44299: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44348: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 15794 1726882603.44433: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44531: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15794 1726882603.44536: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44574: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15794 1726882603.44616: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 15794 1726882603.44696: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44717: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15794 1726882603.44767: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44824: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44909: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.44994: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 15794 1726882603.44998: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.dragonfly' # <<< 15794 1726882603.45025: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45060: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 15794 1726882603.45138: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45342: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45561: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15794 1726882603.45589: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45618: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45671: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 15794 1726882603.45697: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45725: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45785: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15794 1726882603.45806: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45882: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.45977: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15794 1726882603.45992: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.46090: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.46191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 15794 1726882603.46204: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15794 1726882603.46282: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882603.46507: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15794 1726882603.46536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 15794 1726882603.46551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15794 1726882603.46591: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e102fb30> <<< 15794 1726882603.46603: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e102d190> <<< 15794 1726882603.46654: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e102f1d0> <<< 15794 1726882603.63781: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 15794 1726882603.63819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1074d70> <<< 15794 1726882603.63859: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 15794 1726882603.63862: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 15794 1726882603.63894: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1075bb0> <<< 15794 1726882603.63946: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 15794 1726882603.63950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882603.63995: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 15794 1726882603.64024: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e10c4320> <<< 15794 1726882603.64041: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1077fb0> <<< 15794 1726882603.64388: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15794 1726882603.84780: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "43", "epoch": "1726882603", "epoch_int": "1726882603", "date": "2024-09-20", "time": "21:36:43", "iso8601_micro": "2024-09-21T01:36:43.472797Z", 
"iso8601": "2024-09-21T01:36:43Z", "iso8601_basic": "20240920T213643472797", "iso8601_basic_short": "20240920T213643", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "", "ansible_loadavg": {"1m": 0.47607421875, "5m": 0.43212890625, "15m": 0.2099609375}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": 
"eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2870, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 847, "free": 2870}, "nocache": {"free": 3474, "used": 243}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_s<<< 15794 1726882603.84869: stdout chunk (state=3): >>>erial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": 
"ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 557, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205476352, "block_size": 4096, "block_total": 64483404, "block_available": 61329462, "block_used": 3153942, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_local": {}, "ansible_lsb": {}, "ansible_service_mgr": 
"systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882603.85226: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 15794 1726882603.85235: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 15794 1726882603.85238: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases <<< 15794 1726882603.85260: stdout chunk (state=3): >>># cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools 
# cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants <<< 15794 1726882603.85291: stdout chunk (state=3): >>># cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 15794 1726882603.85363: stdout chunk (state=3): >>># cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing 
encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # 
cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing <<< 15794 1726882603.85390: stdout chunk (state=3): >>># destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] 
removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils <<< 15794 1726882603.85417: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils <<< 15794 1726882603.85483: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin 
# cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors<<< 15794 1726882603.85696: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] 
removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15794 1726882603.85937: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15794 1726882603.85942: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15794 1726882603.85999: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 15794 1726882603.86002: stdout chunk (state=3): >>># destroy binascii <<< 15794 1726882603.86005: stdout chunk (state=3): >>># destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 15794 1726882603.86048: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 15794 1726882603.86053: stdout chunk (state=3): >>># destroy ipaddress <<< 15794 1726882603.86136: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15794 1726882603.86261: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle <<< 15794 1726882603.86283: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction <<< 
15794 1726882603.86367: stdout chunk (state=3): >>># destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 15794 1726882603.86400: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 15794 1726882603.86404: stdout chunk (state=3): >>># destroy json <<< 15794 1726882603.86421: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 15794 1726882603.86454: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context <<< 15794 1726882603.86584: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 15794 1726882603.86696: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib 
# cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 15794 1726882603.86728: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15794 1726882603.86881: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15794 1726882603.86937: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # 
destroy tokenize <<< 15794 1726882603.86970: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15794 1726882603.86995: stdout chunk (state=3): >>># destroy _typing <<< 15794 1726882603.87048: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15794 1726882603.87263: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15794 1726882603.87835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882603.88092: stdout chunk (state=3): >>><<< 15794 1726882603.88095: stderr chunk (state=3): >>><<< 15794 1726882603.88356: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e232c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e22fbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e232eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e20fd160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e20fdfd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e213bda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e213bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21737d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2173e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2153a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2151190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2138f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21976b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21962d0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2152030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e213ae40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c8770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21381d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21c8c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c8ad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21c8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e2136d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c9520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21c91f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21ca420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e4620> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21e5d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e6c60> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21e72c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e61b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e21e7d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21e7470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21ca390> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1edfcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1f08770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f084d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1f087a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1f08980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1edde50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f0a000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f08c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e21cab40> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f3a360> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f52510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f8b2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1fb1a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f8b410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f531a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1dd03e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f51550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1f0af00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa8e1dd05c0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_445g8ccm/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e36000> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e0cef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1dd3f80> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e0fe60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1e659a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e65730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e65040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e65790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e36a20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fa8e1e666c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1e66840> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1e66d50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cccb30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1cce7e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1ccf1a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd0380> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd2db0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1cd2ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd10a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd6bd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd56d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd5430> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fa8e1cd7b30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1cd15b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d1ad20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1af60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d1ca10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1c7d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d1ef90> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1d100> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2a780> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1f110> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2b560> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2b9b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2ba40> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d1b140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2e9f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2fdd0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2d190> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1d2e090> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2cd40> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bb8080> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bb9520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d2e540> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bb8d70> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bbb770> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bc1a60> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bc23c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1d33260> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1bc1100> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bc2570> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c567b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bcc3e0> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bc65a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bc63c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5cd10> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1147d70> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1148110> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bd4e30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1bd4380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5eba0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5e5a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e114b170> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e114aa20> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e114ac00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1149e50> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e114b290> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e11bddc0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e114bda0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1c5e5d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e11befc0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e11be930> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e11ea0f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e11daed0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e1005c70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1005880> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8e102fb30> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e102d190> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e102f1d0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1074d70> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1075bb0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e10c4320> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8e1077fb0>
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
{"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 
10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": 
"43", "epoch": "1726882603", "epoch_int": "1726882603", "date": "2024-09-20", "time": "21:36:43", "iso8601_micro": "2024-09-21T01:36:43.472797Z", "iso8601": "2024-09-21T01:36:43Z", "iso8601_basic": "20240920T213643472797", "iso8601_basic_short": "20240920T213643", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "", "ansible_loadavg": {"1m": 0.47607421875, "5m": 0.43212890625, "15m": 0.2099609375}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2870, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 847, "free": 2870}, "nocache": {"free": 3474, "used": 243}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 557, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205476352, "block_size": 4096, "block_total": 64483404, "block_available": 61329462, "block_used": 3153942, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": 
"f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_local": {}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy 
ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] 
removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] 
removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib 
# destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc 
# cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # 
cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. [WARNING]: Module invocation had junk after the JSON data: (Python interpreter shutdown trace, repeated verbatim from the module stderr above)
removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user 
# cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # 
destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # 
cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy 
traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 15794 1726882603.92842: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882603.92855: _low_level_execute_command(): starting 15794 1726882603.92859: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882602.4910913-15810-192487628213376/ > /dev/null 2>&1 && sleep 0' 15794 1726882603.94114: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882603.94397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882603.94413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882603.94429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882603.94524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882603.96546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882603.96699: stderr chunk (state=3): >>><<< 15794 1726882603.96708: stdout chunk (state=3): >>><<< 15794 1726882603.96730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882603.96748: handler run complete 15794 1726882603.97186: variable 'ansible_facts' from source: unknown 15794 1726882603.97471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882603.98464: variable 'ansible_facts' from source: unknown 15794 1726882603.98862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882603.99156: attempt loop complete, returning result 15794 1726882603.99167: _execute() done 15794 1726882603.99176: dumping result to json 15794 1726882603.99277: done dumping result, returning 15794 1726882603.99301: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-00000000007c] 15794 1726882603.99310: sending task result for task 0affe814-3a2d-94e5-e48f-00000000007c 15794 1726882604.00065: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000007c 15794 1726882604.00068: WORKER PROCESS EXITING ok: [managed_node1] 15794 1726882604.01551: no more pending results, returning what we have 15794 1726882604.01555: results queue empty 15794 1726882604.01556: checking for any_errors_fatal 15794 1726882604.01558: done checking for any_errors_fatal 15794 1726882604.01559: checking for max_fail_percentage 15794 1726882604.01561: done checking for max_fail_percentage 15794 1726882604.01561: checking to see if 
all hosts have failed and the running result is not ok 15794 1726882604.01563: done checking to see if all hosts have failed 15794 1726882604.01564: getting the remaining hosts for this loop 15794 1726882604.01566: done getting the remaining hosts for this loop 15794 1726882604.01569: getting the next task for host managed_node1 15794 1726882604.01576: done getting next task for host managed_node1 15794 1726882604.01580: ^ task is: TASK: meta (flush_handlers) 15794 1726882604.01583: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882604.01587: getting variables 15794 1726882604.01589: in VariableManager get_vars() 15794 1726882604.01613: Calling all_inventory to load vars for managed_node1 15794 1726882604.01615: Calling groups_inventory to load vars for managed_node1 15794 1726882604.01619: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882604.01629: Calling all_plugins_play to load vars for managed_node1 15794 1726882604.01632: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882604.01637: Calling groups_plugins_play to load vars for managed_node1 15794 1726882604.02331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.02797: done with get_vars() 15794 1726882604.02811: done getting variables 15794 1726882604.03096: in VariableManager get_vars() 15794 1726882604.03107: Calling all_inventory to load vars for managed_node1 15794 1726882604.03110: Calling groups_inventory to load vars for managed_node1 15794 1726882604.03114: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882604.03120: Calling all_plugins_play to load vars for managed_node1 15794 1726882604.03123: 
Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882604.03127: Calling groups_plugins_play to load vars for managed_node1 15794 1726882604.03438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.04003: done with get_vars() 15794 1726882604.04019: done queuing things up, now waiting for results queue to drain 15794 1726882604.04022: results queue empty 15794 1726882604.04023: checking for any_errors_fatal 15794 1726882604.04026: done checking for any_errors_fatal 15794 1726882604.04027: checking for max_fail_percentage 15794 1726882604.04028: done checking for max_fail_percentage 15794 1726882604.04029: checking to see if all hosts have failed and the running result is not ok 15794 1726882604.04030: done checking to see if all hosts have failed 15794 1726882604.04240: getting the remaining hosts for this loop 15794 1726882604.04242: done getting the remaining hosts for this loop 15794 1726882604.04246: getting the next task for host managed_node1 15794 1726882604.04252: done getting next task for host managed_node1 15794 1726882604.04255: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15794 1726882604.04257: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882604.04259: getting variables 15794 1726882604.04261: in VariableManager get_vars() 15794 1726882604.04271: Calling all_inventory to load vars for managed_node1 15794 1726882604.04274: Calling groups_inventory to load vars for managed_node1 15794 1726882604.04277: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882604.04285: Calling all_plugins_play to load vars for managed_node1 15794 1726882604.04288: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882604.04292: Calling groups_plugins_play to load vars for managed_node1 15794 1726882604.04686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.05149: done with get_vars() 15794 1726882604.05159: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Friday 20 September 2024 21:36:44 -0400 (0:00:01.603) 0:00:01.612 ****** 15794 1726882604.05454: entering _queue_task() for managed_node1/include_tasks 15794 1726882604.05456: Creating lock for include_tasks 15794 1726882604.06001: worker is 1 (out of 1 available) 15794 1726882604.06014: exiting _queue_task() for managed_node1/include_tasks 15794 1726882604.06027: done queuing things up, now waiting for results queue to drain 15794 1726882604.06029: waiting for pending results... 
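The `TASK [Include the task 'el_repo_setup.yml']` entry above, with its task path at `tests_ethernet_nm.yml:11`, corresponds to an `include_tasks` task in the test playbook. The exact task definition is not shown in this log; a hypothetical sketch of what such a task typically looks like (task name taken from the log, file layout assumed from the include path resolved a few lines below):

```yaml
# Hypothetical reconstruction -- not taken from the actual playbook.
# The log later resolves the include to
# tests/network/tasks/el_repo_setup.yml relative to the collection.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```

Because `include_tasks` is dynamic, the included file is loaded at runtime, which is why the log shows "we have included files to process" and "generating all_blocks data" only after the task itself completes.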
15794 1726882604.06553: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 15794 1726882604.06686: in run() - task 0affe814-3a2d-94e5-e48f-000000000006 15794 1726882604.06788: variable 'ansible_search_path' from source: unknown 15794 1726882604.06858: calling self._execute() 15794 1726882604.06940: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882604.07056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882604.07068: variable 'omit' from source: magic vars 15794 1726882604.07301: _execute() done 15794 1726882604.07306: dumping result to json 15794 1726882604.07309: done dumping result, returning 15794 1726882604.07318: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affe814-3a2d-94e5-e48f-000000000006] 15794 1726882604.07328: sending task result for task 0affe814-3a2d-94e5-e48f-000000000006 15794 1726882604.07686: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000006 15794 1726882604.07691: WORKER PROCESS EXITING 15794 1726882604.07737: no more pending results, returning what we have 15794 1726882604.07742: in VariableManager get_vars() 15794 1726882604.07771: Calling all_inventory to load vars for managed_node1 15794 1726882604.07775: Calling groups_inventory to load vars for managed_node1 15794 1726882604.07781: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882604.07792: Calling all_plugins_play to load vars for managed_node1 15794 1726882604.07797: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882604.07801: Calling groups_plugins_play to load vars for managed_node1 15794 1726882604.08253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.08628: done with get_vars() 15794 1726882604.08850: variable 'ansible_search_path' from source: unknown 15794 1726882604.08866: we have 
included files to process 15794 1726882604.08867: generating all_blocks data 15794 1726882604.08869: done generating all_blocks data 15794 1726882604.08870: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15794 1726882604.08872: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15794 1726882604.08875: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15794 1726882604.10642: in VariableManager get_vars() 15794 1726882604.10661: done with get_vars() 15794 1726882604.10676: done processing included file 15794 1726882604.10681: iterating over new_blocks loaded from include file 15794 1726882604.10683: in VariableManager get_vars() 15794 1726882604.10694: done with get_vars() 15794 1726882604.10696: filtering new block on tags 15794 1726882604.10713: done filtering new block on tags 15794 1726882604.10717: in VariableManager get_vars() 15794 1726882604.10953: done with get_vars() 15794 1726882604.10955: filtering new block on tags 15794 1726882604.10976: done filtering new block on tags 15794 1726882604.10982: in VariableManager get_vars() 15794 1726882604.10995: done with get_vars() 15794 1726882604.10997: filtering new block on tags 15794 1726882604.11013: done filtering new block on tags 15794 1726882604.11016: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 15794 1726882604.11022: extending task lists for all hosts with included blocks 15794 1726882604.11089: done extending task lists 15794 1726882604.11091: done processing included files 15794 1726882604.11092: results queue empty 15794 1726882604.11093: checking for any_errors_fatal 15794 1726882604.11094: done checking for any_errors_fatal 15794 
1726882604.11095: checking for max_fail_percentage 15794 1726882604.11097: done checking for max_fail_percentage 15794 1726882604.11097: checking to see if all hosts have failed and the running result is not ok 15794 1726882604.11098: done checking to see if all hosts have failed 15794 1726882604.11099: getting the remaining hosts for this loop 15794 1726882604.11101: done getting the remaining hosts for this loop 15794 1726882604.11104: getting the next task for host managed_node1 15794 1726882604.11109: done getting next task for host managed_node1 15794 1726882604.11111: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15794 1726882604.11114: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882604.11116: getting variables 15794 1726882604.11118: in VariableManager get_vars() 15794 1726882604.11127: Calling all_inventory to load vars for managed_node1 15794 1726882604.11130: Calling groups_inventory to load vars for managed_node1 15794 1726882604.11133: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882604.11343: Calling all_plugins_play to load vars for managed_node1 15794 1726882604.11347: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882604.11351: Calling groups_plugins_play to load vars for managed_node1 15794 1726882604.11743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.12216: done with get_vars() 15794 1726882604.12226: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:36:44 -0400 (0:00:00.068) 0:00:01.681 ****** 15794 1726882604.12305: entering _queue_task() for managed_node1/setup 15794 1726882604.12690: worker is 1 (out of 1 available) 15794 1726882604.12701: exiting _queue_task() for managed_node1/setup 15794 1726882604.12715: done queuing things up, now waiting for results queue to drain 15794 1726882604.12716: waiting for pending results... 
15794 1726882604.13053: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 15794 1726882604.13059: in run() - task 0affe814-3a2d-94e5-e48f-00000000008d 15794 1726882604.13083: variable 'ansible_search_path' from source: unknown 15794 1726882604.13091: variable 'ansible_search_path' from source: unknown 15794 1726882604.13135: calling self._execute() 15794 1726882604.13237: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882604.13252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882604.13273: variable 'omit' from source: magic vars 15794 1726882604.14006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882604.16703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882604.16810: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882604.16860: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882604.16983: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882604.16986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882604.17068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882604.17119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882604.17164: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882604.17230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882604.17258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882604.17517: variable 'ansible_facts' from source: unknown 15794 1726882604.17616: variable 'network_test_required_facts' from source: task vars 15794 1726882604.17681: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15794 1726882604.17745: variable 'omit' from source: magic vars 15794 1726882604.17751: variable 'omit' from source: magic vars 15794 1726882604.17804: variable 'omit' from source: magic vars 15794 1726882604.17932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882604.17972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882604.18004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882604.18068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882604.18151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882604.18174: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882604.18260: variable 'ansible_host' from source: host vars for 
'managed_node1' 15794 1726882604.18265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882604.18480: Set connection var ansible_connection to ssh 15794 1726882604.18484: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882604.18554: Set connection var ansible_pipelining to False 15794 1726882604.18564: Set connection var ansible_shell_executable to /bin/sh 15794 1726882604.18572: Set connection var ansible_shell_type to sh 15794 1726882604.18596: Set connection var ansible_timeout to 10 15794 1726882604.18676: variable 'ansible_shell_executable' from source: unknown 15794 1726882604.18700: variable 'ansible_connection' from source: unknown 15794 1726882604.18707: variable 'ansible_module_compression' from source: unknown 15794 1726882604.18714: variable 'ansible_shell_type' from source: unknown 15794 1726882604.18771: variable 'ansible_shell_executable' from source: unknown 15794 1726882604.18774: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882604.18776: variable 'ansible_pipelining' from source: unknown 15794 1726882604.18781: variable 'ansible_timeout' from source: unknown 15794 1726882604.18783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882604.19240: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882604.19248: variable 'omit' from source: magic vars 15794 1726882604.19254: starting attempt loop 15794 1726882604.19257: running the handler 15794 1726882604.19259: _low_level_execute_command(): starting 15794 1726882604.19261: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882604.20144: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882604.20160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.20261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882604.22051: stdout chunk (state=3): >>>/root <<< 15794 1726882604.22194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882604.22509: stderr chunk (state=3): >>><<< 15794 1726882604.22513: stdout chunk (state=3): >>><<< 15794 1726882604.22517: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882604.22526: _low_level_execute_command(): starting 15794 1726882604.22529: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907 `" && echo ansible-tmp-1726882604.2241104-15852-203442394300907="` echo /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907 `" ) && sleep 0' 15794 1726882604.23057: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882604.23151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882604.23186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882604.23202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882604.23226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.23436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882604.25368: stdout chunk (state=3): >>>ansible-tmp-1726882604.2241104-15852-203442394300907=/root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907 <<< 15794 1726882604.25530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882604.25849: stderr chunk (state=3): >>><<< 15794 1726882604.25852: stdout chunk (state=3): >>><<< 15794 1726882604.25876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882604.2241104-15852-203442394300907=/root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882604.25935: variable 'ansible_module_compression' from source: unknown 15794 1726882604.25990: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882604.26185: variable 'ansible_facts' from source: unknown 15794 1726882604.26641: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/AnsiballZ_setup.py 15794 1726882604.27063: Sending initial data 15794 1726882604.27066: Sent initial data (154 bytes) 15794 1726882604.27602: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882604.27617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882604.27632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882604.27746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK <<< 15794 1726882604.27767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.27860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882604.29529: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882604.29604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882604.29685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp1f2j33h8 /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/AnsiballZ_setup.py <<< 15794 1726882604.29690: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/AnsiballZ_setup.py" <<< 15794 1726882604.29966: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp1f2j33h8" to remote "/root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/AnsiballZ_setup.py" <<< 15794 1726882604.32434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882604.32535: stderr chunk (state=3): >>><<< 15794 1726882604.32547: stdout chunk (state=3): >>><<< 15794 1726882604.32580: done transferring module to remote 15794 1726882604.32615: _low_level_execute_command(): starting 15794 1726882604.32631: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/ /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/AnsiballZ_setup.py && sleep 0' 15794 1726882604.33571: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882604.33575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882604.33581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882604.33622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882604.33716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882604.33748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882604.33803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.33885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882604.35850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882604.35883: stderr chunk (state=3): >>><<< 15794 1726882604.35896: stdout chunk (state=3): >>><<< 15794 1726882604.35919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882604.35929: _low_level_execute_command(): starting 15794 1726882604.36013: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/AnsiballZ_setup.py && sleep 0' 15794 1726882604.36517: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882604.36533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882604.36555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882604.36573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882604.36588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882604.36602: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882604.36618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882604.36641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882604.36656: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882604.36752: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882604.36773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882604.36790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.37055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882604.39213: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15794 1726882604.39250: stdout chunk (state=3): >>>import _imp # builtin <<< 15794 1726882604.39277: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15794 1726882604.39356: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15794 1726882604.39401: stdout chunk (state=3): >>>import 'posix' # <<< 15794 1726882604.39429: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 15794 1726882604.39461: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 15794 1726882604.39484: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 15794 1726882604.39529: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15794 1726882604.39562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.39581: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 15794 1726882604.39618: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15794 1726882604.39655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15794 1726882604.39673: stdout chunk 
(state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484cd4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ca3b30> <<< 15794 1726882604.39705: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 15794 1726882604.39731: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484cd6ab0> <<< 15794 1726882604.39736: stdout chunk (state=3): >>>import '_signal' # <<< 15794 1726882604.39752: stdout chunk (state=3): >>>import '_abc' # <<< 15794 1726882604.39767: stdout chunk (state=3): >>>import 'abc' # <<< 15794 1726882604.39785: stdout chunk (state=3): >>>import 'io' # <<< 15794 1726882604.39814: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15794 1726882604.39913: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15794 1726882604.39942: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15794 1726882604.39971: stdout chunk (state=3): >>>import 'os' # <<< 15794 1726882604.40025: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 15794 1726882604.40029: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 15794 1726882604.40072: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15794 1726882604.40094: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15794 1726882604.40097: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484a85160> <<< 15794 1726882604.40171: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15794 1726882604.40174: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.40198: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484a85fd0> <<< 15794 1726882604.40217: stdout chunk (state=3): >>>import 'site' # <<< 15794 1726882604.40246: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15794 1726882604.40770: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15794 1726882604.40773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15794 1726882604.40819: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15794 1726882604.40864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff484ac3e90> <<< 15794 1726882604.40977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15794 1726882604.41106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15794 1726882604.41111: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ac3f50> <<< 15794 1726882604.41114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15794 1726882604.41117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15794 1726882604.41119: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15794 1726882604.41121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.41124: stdout chunk (state=3): >>>import 'itertools' # <<< 15794 1726882604.41126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484afb8c0> <<< 15794 1726882604.41128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15794 1726882604.41131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484afbf50> <<< 15794 1726882604.41221: stdout chunk (state=3): >>>import '_collections' # import 'collections' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff484adbb60> import '_functools' # <<< 15794 1726882604.41225: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ad9280> <<< 15794 1726882604.41312: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ac1040> <<< 15794 1726882604.41362: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15794 1726882604.41406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15794 1726882604.41451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15794 1726882604.41454: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15794 1726882604.41522: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b1f800> <<< 15794 1726882604.41526: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b1e420> <<< 15794 1726882604.41553: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ada150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b1ccb0> <<< 15794 1726882604.41619: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b50890> <<< 15794 1726882604.41666: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ac02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15794 1726882604.41682: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b50d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b50bf0> <<< 15794 1726882604.41752: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.41757: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b50fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484abede0> <<< 15794 1726882604.41788: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15794 1726882604.41868: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15794 1726882604.41872: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b51670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b51340> <<< 15794 1726882604.41925: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b52510> <<< 15794 1726882604.41929: stdout chunk (state=3): >>>import 'importlib.util' # <<< 15794 1726882604.41968: stdout chunk (state=3): >>>import 'runpy' # <<< 15794 1726882604.41973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15794 1726882604.42023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15794 1726882604.42060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b68740> <<< 15794 1726882604.42064: stdout chunk (state=3): >>>import 'errno' # <<< 15794 1726882604.42094: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b69e20> <<< 15794 1726882604.42143: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15794 1726882604.42156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b6acf0> <<< 15794 1726882604.42194: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b6b350> <<< 15794 1726882604.42237: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b6a270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15794 1726882604.42281: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.42293: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b6bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b6b500> <<< 15794 1726882604.42357: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b52570> <<< 15794 1726882604.42383: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches 
/usr/lib64/python3.12/tempfile.py <<< 15794 1726882604.42421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15794 1726882604.42439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15794 1726882604.42479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15794 1726882604.42527: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48487fc80> <<< 15794 1726882604.42553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848a87a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848a8500> <<< 15794 1726882604.42605: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848a87d0> <<< 15794 1726882604.42625: stdout chunk (state=3): >>># extension module '_sha2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848a89b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48487de20> <<< 15794 1726882604.42647: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15794 1726882604.42753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15794 1726882604.42779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15794 1726882604.42822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848aa0c0> <<< 15794 1726882604.42827: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848a8d40> <<< 15794 1726882604.42867: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b52c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15794 1726882604.42917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.42937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15794 1726882604.42977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15794 1726882604.43004: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff4848d6420> <<< 15794 1726882604.43086: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15794 1726882604.43106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15794 1726882604.43122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15794 1726882604.43175: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848f2540> <<< 15794 1726882604.43220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15794 1726882604.43232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15794 1726882604.43286: stdout chunk (state=3): >>>import 'ntpath' # <<< 15794 1726882604.43327: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48492b2c0> <<< 15794 1726882604.43357: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15794 1726882604.43393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15794 1726882604.43411: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15794 
1726882604.43450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15794 1726882604.43551: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484951a60> <<< 15794 1726882604.43624: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48492b3e0> <<< 15794 1726882604.43665: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848f31d0> <<< 15794 1726882604.43699: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484770380> <<< 15794 1726882604.43725: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848f1580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848aafc0> <<< 15794 1726882604.43896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15794 1726882604.43911: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff484770560> <<< 15794 1726882604.44095: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_nqxecevi/ansible_setup_payload.zip' # zipimport: zlib available <<< 15794 1726882604.44244: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.44277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15794 1726882604.44295: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15794 1726882604.44329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15794 1726882604.44405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15794 1726882604.44458: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4847d9fd0> <<< 15794 1726882604.44461: stdout chunk (state=3): >>>import '_typing' # <<< 15794 1726882604.44657: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4847b0ec0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484773f50> <<< 15794 1726882604.44682: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.44719: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 15794 1726882604.44763: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.44767: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 15794 1726882604.44769: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.46308: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.47600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4847b3e60> <<< 15794 1726882604.47624: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15794 1726882604.47641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.47675: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15794 1726882604.47678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15794 1726882604.47707: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15794 1726882604.47743: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.47746: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848099a0> <<< 15794 1726882604.47771: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484809730> <<< 15794 1726882604.47807: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484809040> <<< 15794 1726882604.47832: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15794 1726882604.47883: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484809a90> import 'json' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7ff4847dac60> <<< 15794 1726882604.47916: stdout chunk (state=3): >>>import 'atexit' # <<< 15794 1726882604.47929: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48480a750> <<< 15794 1726882604.47963: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48480a990> <<< 15794 1726882604.47976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15794 1726882604.48018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15794 1726882604.48035: stdout chunk (state=3): >>>import '_locale' # <<< 15794 1726882604.48091: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48480aed0> <<< 15794 1726882604.48110: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15794 1726882604.48133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15794 1726882604.48184: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484670b90> <<< 15794 1726882604.48222: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' 
executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846727b0> <<< 15794 1726882604.48235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15794 1726882604.48249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15794 1726882604.48303: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484673140> <<< 15794 1726882604.48306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15794 1726882604.48357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15794 1726882604.48360: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484674320> <<< 15794 1726882604.48372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15794 1726882604.48404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15794 1726882604.48439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 15794 1726882604.48450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15794 1726882604.48493: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484676db0> <<< 15794 1726882604.48538: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module 
'_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484677110> <<< 15794 1726882604.48567: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484675070> <<< 15794 1726882604.48577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15794 1726882604.48627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15794 1726882604.48632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 15794 1726882604.48655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15794 1726882604.48695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15794 1726882604.48728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15794 1726882604.48743: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48467ade0> import '_tokenize' # <<< 15794 1726882604.48825: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846798b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484679610> <<< 15794 1726882604.48841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code 
object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15794 1726882604.48920: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48467bd10> <<< 15794 1726882604.48948: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484675580> <<< 15794 1726882604.48991: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846beed0> <<< 15794 1726882604.49039: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846bf050> <<< 15794 1726882604.49042: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15794 1726882604.49082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15794 1726882604.49085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15794 1726882604.49124: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846c0c20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846c09e0> <<< 15794 1726882604.49139: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15794 1726882604.49250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15794 1726882604.49304: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846c3170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846c12e0> <<< 15794 1726882604.49334: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15794 1726882604.49385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.49412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15794 1726882604.49427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15794 1726882604.49472: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846ce960> <<< 15794 1726882604.49626: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846c32f0> <<< 15794 1726882604.49708: stdout chunk (state=3): >>># extension module 
'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846cfc20> <<< 15794 1726882604.49751: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.49756: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846cf9e0> <<< 15794 1726882604.49826: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846cfc50> <<< 15794 1726882604.49830: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846bf350> <<< 15794 1726882604.49862: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15794 1726882604.49871: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15794 1726882604.49893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' 
<<< 15794 1726882604.49919: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.49950: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846d33b0> <<< 15794 1726882604.50152: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.50170: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846d4530> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846d1b20> <<< 15794 1726882604.50229: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846d2ea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846d1700> <<< 15794 1726882604.50237: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.50262: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15794 1726882604.50362: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.50507: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 15794 
1726882604.50531: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15794 1726882604.50547: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.50680: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.50827: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.51496: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.52194: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15794 1726882604.52212: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15794 1726882604.52258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.52293: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.52316: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48455c5f0> <<< 15794 1726882604.52429: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15794 1726882604.52446: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455d400> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846d7aa0> <<< 15794 1726882604.52516: stdout chunk (state=3): 
>>>import 'ansible.module_utils.compat.selinux' # <<< 15794 1726882604.52520: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.52556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15794 1726882604.52740: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.52939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15794 1726882604.52954: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455d4f0> # zipimport: zlib available <<< 15794 1726882604.53513: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54066: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54153: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54242: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15794 1726882604.54255: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54303: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54343: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 15794 1726882604.54441: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54587: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15794 1726882604.54592: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 15794 1726882604.54649: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54652: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.54691: stdout chunk (state=3): >>>import 
'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15794 1726882604.54980: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.55267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15794 1726882604.55347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15794 1726882604.55351: stdout chunk (state=3): >>>import '_ast' # <<< 15794 1726882604.55439: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455ffe0> # zipimport: zlib available <<< 15794 1726882604.55530: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.55625: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15794 1726882604.55664: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15794 1726882604.55690: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15794 1726882604.55752: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.55873: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4845660c0> <<< 15794 1726882604.55953: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484566a20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455ec00> # zipimport: zlib available <<< 15794 1726882604.56009: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56074: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15794 1726882604.56077: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56131: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56156: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56214: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56473: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15794 1726882604.56476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4845657c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484566c90> <<< 15794 1726882604.56528: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15794 1726882604.56531: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56596: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56657: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15794 1726882604.56688: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.56732: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882604.56755: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15794 1726882604.56776: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15794 1726882604.56805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15794 1726882604.56863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15794 1726882604.56895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15794 1726882604.56914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15794 1726882604.56963: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845f6cc0> <<< 15794 1726882604.57013: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484573b60> <<< 15794 1726882604.57104: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48456aae0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48456a930> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15794 1726882604.57115: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15794 1726882604.57143: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57175: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15794 1726882604.57266: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15794 1726882604.57270: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 15794 1726882604.57294: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57352: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57432: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57446: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57463: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57502: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57551: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57590: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57640: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15794 1726882604.57650: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57727: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57810: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57838: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.57882: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 15794 1726882604.57886: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.58084: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.58280: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.58320: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15794 1726882604.58386: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 15794 1726882604.58415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 15794 1726882604.58439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15794 1726882604.58463: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15794 1726882604.58485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15794 1726882604.58520: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845fd9d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15794 1726882604.58551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15794 1726882604.58554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15794 1726882604.58598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15794 1726882604.58626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15794 
1726882604.58646: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4c2f0> <<< 15794 1726882604.58683: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.58700: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483b4c620> <<< 15794 1726882604.58752: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845e53a0> <<< 15794 1726882604.58790: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845e46b0> <<< 15794 1726882604.58811: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845fc170> <<< 15794 1726882604.58827: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845ff920> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15794 1726882604.58916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15794 1726882604.58932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15794 1726882604.58957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15794 1726882604.58996: stdout chunk (state=3): >>># 
extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.59032: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483b4f500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4edb0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.59070: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483b4ef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4e1e0> <<< 15794 1726882604.59076: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15794 1726882604.59191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15794 1726882604.59215: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4f470> <<< 15794 1726882604.59228: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15794 1726882604.59254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15794 1726882604.59284: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' 
executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483bb6030> <<< 15794 1726882604.59317: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4ff50> <<< 15794 1726882604.59358: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845fd160> <<< 15794 1726882604.59408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 15794 1726882604.59411: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15794 1726882604.59428: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59485: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59561: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15794 1726882604.59564: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59614: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 15794 1726882604.59712: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15794 1726882604.59756: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 15794 1726882604.59797: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59847: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59911: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.caps' # <<< 15794 1726882604.59916: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.59954: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.60011: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15794 1726882604.60014: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.60071: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.60132: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.60195: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.60273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 15794 1726882604.60284: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.60864: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 15794 1726882604.61374: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61442: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61464: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61509: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 15794 1726882604.61554: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61557: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61595: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15794 1726882604.61599: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61662: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61713: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15794 1726882604.61748: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61771: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15794 1726882604.61839: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 15794 1726882604.61889: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.61971: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.62082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15794 1726882604.62086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15794 1726882604.62139: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483bb60c0> <<< 15794 1726882604.62143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15794 1726882604.62159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15794 1726882604.62292: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483bb6ab0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15794 1726882604.62374: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.62439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 15794 1726882604.62547: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 
1726882604.62660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15794 1726882604.62664: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.62730: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.62809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 15794 1726882604.62856: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.62913: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15794 1726882604.62963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15794 1726882604.63037: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882604.63106: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483bee2d0> <<< 15794 1726882604.63315: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483bdaf30> <<< 15794 1726882604.63327: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 15794 1726882604.63391: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.63451: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 15794 1726882604.63552: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.63645: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.63768: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.63944: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 15794 1726882604.63960: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.63989: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64032: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15794 1726882604.64058: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64082: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64145: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 15794 1726882604.64214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15794 1726882604.64222: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48397db50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48397da90> <<< 15794 1726882604.64254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 15794 1726882604.64302: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 15794 1726882604.64372: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64528: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64697: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 15794 1726882604.64816: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64927: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.64974: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.65020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15794 1726882604.65043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 15794 1726882604.65065: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.65088: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.65243: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.65422: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15794 1726882604.65438: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.65561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.65699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 15794 1726882604.65731: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.65775: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.66401: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.66985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 15794 1726882604.67001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15794 1726882604.67111: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.67227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15794 1726882604.67240: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15794 1726882604.67345: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.67461: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 15794 1726882604.67630: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.67884: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available <<< 15794 1726882604.67900: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 15794 1726882604.67961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15794 1726882604.68064: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68173: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68396: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15794 1726882604.68641: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68675: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15794 1726882604.68760: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68763: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15794 1726882604.68870: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.68960: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: 
zlib available <<< 15794 1726882604.68983: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.69016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15794 1726882604.69029: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.69089: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.69156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 15794 1726882604.69221: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.69300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15794 1726882604.69311: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.69591: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.70074: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 15794 1726882604.70242: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 15794 1726882604.70291: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 15794 1726882604.70360: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.70456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15794 1726882604.70493: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 15794 1726882604.70652: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 15794 1726882604.70655: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.70702: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.70753: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.70833: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.70959: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15794 1726882604.70995: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.71050: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 15794 1726882604.71073: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.71285: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.71501: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15794 1726882604.71524: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.71629: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 15794 1726882604.71669: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.71723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15794 1726882604.71743: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.71963: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 15794 1726882604.72019: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.72124: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15794 1726882604.72206: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882604.73159: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15794 1726882604.73181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15794 1726882604.73204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 15794 1726882604.73246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15794 1726882604.73271: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4839ab320> <<< 15794 1726882604.73294: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4839abd10> <<< 15794 1726882604.73337: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4839ab3e0> <<< 15794 1726882604.73730: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": 
["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, 
"ansible_local": {}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "44", "epoch": "1726882604", "epoch_int": "1726882604", "date": "2024-09-20", "time": "21:36:44", "iso8601_micro": "2024-09-21T01:36:44.734824Z", "iso8601": "2024-09-21T01:36:44Z", "iso8601_basic": "20240920T213644734824", "iso8601_basic_short": "20240920T213644", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882604.74413: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear 
builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 15794 1726882604.74459: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 15794 1726882604.74581: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] 
removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 15794 1726882604.74585: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils <<< 15794 1726882604.74789: stdout chunk (state=3): >>># destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] 
removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # 
cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy 
ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy 
ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15794 1726882604.75096: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15794 1726882604.75100: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15794 1726882604.75443: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 15794 1726882604.75451: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy 
__main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 15794 1726882604.75486: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 15794 1726882604.75527: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 15794 1726882604.75763: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 15794 1726882604.75773: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 15794 1726882604.75897: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] 
wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 15794 1726882604.75901: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15794 1726882604.76079: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15794 1726882604.76125: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 15794 1726882604.76351: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15794 1726882604.76365: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # 
destroy builtins # destroy _thread # clear sys.audit hooks <<< 15794 1726882604.77121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882604.77125: stdout chunk (state=3): >>><<< 15794 1726882604.77127: stderr chunk (state=3): >>><<< 15794 1726882604.77275: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484cd4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ca3b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484cd6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: 
'/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484a85160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484a85fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ac3e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 
'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ac3f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484afb8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484afbf50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484adbb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ad9280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ac1040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff484b1f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b1e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ada150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b1ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b50890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484ac02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b50d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b50bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b50fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484abede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b51670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b51340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b52510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b68740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b69e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b6acf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b6b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b6a270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484b6bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b6b500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b52570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48487fc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py 
# code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848a87a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848a8500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848a87d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848a89b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48487de20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848aa0c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848a8d40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484b52c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py 
# code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848d6420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848f2540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48492b2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484951a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48492b3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7ff4848f31d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484770380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848f1580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4848aafc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff484770560> # zipimport: found 103 names in '/tmp/ansible_setup_payload_nqxecevi/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4847d9fd0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4847b0ec0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484773f50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4847b3e60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4848099a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484809730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484809040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484809a90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4847dac60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7ff48480a750> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48480a990> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48480aed0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484670b90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846727b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484673140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484674320> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484676db0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484677110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484675070> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48467ade0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846798b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484679610> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7ff48467bd10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484675580> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846beed0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846bf050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846c0c20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846c09e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846c3170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846c12e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846ce960> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846c32f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846cfc20> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846cf9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846cfc50> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff4846bf350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846d33b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846d4530> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846d1b20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4846d2ea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846d1700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48455c5f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455d400> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4846d7aa0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455d4f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455ffe0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4845660c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff484566a20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48455ec00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4845657c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484566c90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845f6cc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff484573b60> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7ff48456aae0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48456a930> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845fd9d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4c2f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483b4c620> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845e53a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845e46b0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845fc170> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845ff920> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483b4f500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4edb0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483b4ef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4e1e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4f470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483bb6030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483b4ff50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4845fd160> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483bb60c0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483bb6ab0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff483bee2d0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff483bdaf30> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff48397db50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff48397da90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff4839ab320> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4839abd10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff4839ab3e0> {"ansible_facts": 
{"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", 
"net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "44", "epoch": "1726882604", "epoch_int": "1726882604", "date": "2024-09-20", "time": "21:36:44", "iso8601_micro": "2024-09-21T01:36:44.734824Z", "iso8601": "2024-09-21T01:36:44Z", "iso8601_basic": "20240920T213644734824", "iso8601_basic_short": "20240920T213644", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear 
sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] 
removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] 
removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing 
ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] 
removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # 
destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd 
# destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # 
destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # 
cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request 
# destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. [WARNING]: Module invocation had junk after the JSON data: 15794 1726882604.79557: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882604.79561: _low_level_execute_command(): starting 15794 1726882604.79564: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882604.2241104-15852-203442394300907/ > /dev/null 2>&1 && sleep 0' 15794 1726882604.79595: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882604.79599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882604.79602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882604.79878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882604.79882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882604.79913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.79965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882604.81913: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 15794 1726882604.81981: stderr chunk (state=3): >>><<< 15794 1726882604.81991: stdout chunk (state=3): >>><<< 15794 1726882604.82175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882604.82178: handler run complete 15794 1726882604.82181: variable 'ansible_facts' from source: unknown 15794 1726882604.82365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.82933: variable 'ansible_facts' from source: unknown 15794 1726882604.82937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.83088: attempt loop complete, returning result 15794 1726882604.83148: _execute() done 15794 1726882604.83264: dumping result to json 15794 1726882604.83267: done dumping result, 
returning 15794 1726882604.83270: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affe814-3a2d-94e5-e48f-00000000008d] 15794 1726882604.83272: sending task result for task 0affe814-3a2d-94e5-e48f-00000000008d ok: [managed_node1] 15794 1726882604.83852: no more pending results, returning what we have 15794 1726882604.83856: results queue empty 15794 1726882604.83857: checking for any_errors_fatal 15794 1726882604.83859: done checking for any_errors_fatal 15794 1726882604.83860: checking for max_fail_percentage 15794 1726882604.83863: done checking for max_fail_percentage 15794 1726882604.83864: checking to see if all hosts have failed and the running result is not ok 15794 1726882604.83865: done checking to see if all hosts have failed 15794 1726882604.83866: getting the remaining hosts for this loop 15794 1726882604.83868: done getting the remaining hosts for this loop 15794 1726882604.83873: getting the next task for host managed_node1 15794 1726882604.83883: done getting next task for host managed_node1 15794 1726882604.83887: ^ task is: TASK: Check if system is ostree 15794 1726882604.83890: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882604.83895: getting variables 15794 1726882604.83897: in VariableManager get_vars() 15794 1726882604.83927: Calling all_inventory to load vars for managed_node1 15794 1726882604.84387: Calling groups_inventory to load vars for managed_node1 15794 1726882604.84392: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882604.84405: Calling all_plugins_play to load vars for managed_node1 15794 1726882604.84408: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882604.84413: Calling groups_plugins_play to load vars for managed_node1 15794 1726882604.84630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882604.85241: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000008d 15794 1726882604.85245: WORKER PROCESS EXITING 15794 1726882604.85480: done with get_vars() 15794 1726882604.85493: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:36:44 -0400 (0:00:00.732) 0:00:02.414 ****** 15794 1726882604.85605: entering _queue_task() for managed_node1/stat 15794 1726882604.86417: worker is 1 (out of 1 available) 15794 1726882604.86431: exiting _queue_task() for managed_node1/stat 15794 1726882604.86449: done queuing things up, now waiting for results queue to drain 15794 1726882604.86451: waiting for pending results... 
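The task just queued above determines whether the managed host is an rpm-ostree based system. In the fedora.linux_system_roles test setup this kind of check is conventionally a `stat` of the `/run/ostree-booted` marker file; that path is an assumption here, inferred from the task name and common role convention rather than shown in this log. A minimal Python sketch of the check the `stat` module performs:

```python
import os

def is_ostree_system(marker="/run/ostree-booted"):
    # Hypothetical equivalent of the role's stat task: the presence of
    # the ostree marker file indicates an rpm-ostree based system.
    # The marker path is an assumption, not taken from this log.
    return os.path.exists(marker)

# On a conventional (non-ostree) host the marker is absent and this
# returns False, so the role proceeds with the normal package path.
```

In the real run, the result of the remote `stat` is returned as `ansible_facts`-style module output and consumed by later conditionals, which is why the log shows the `stat` module being ANSIBALLZ-packaged and transferred rather than any local filesystem access.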
15794 1726882604.86867: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 15794 1726882604.86967: in run() - task 0affe814-3a2d-94e5-e48f-00000000008f 15794 1726882604.86980: variable 'ansible_search_path' from source: unknown 15794 1726882604.86984: variable 'ansible_search_path' from source: unknown 15794 1726882604.87024: calling self._execute() 15794 1726882604.87430: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882604.87437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882604.87481: variable 'omit' from source: magic vars 15794 1726882604.88799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882604.89419: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882604.89591: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882604.89631: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882604.89686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882604.89903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882604.89933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882604.90198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882604.90202: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882604.90416: Evaluated conditional (not __network_is_ostree is defined): True 15794 1726882604.90632: variable 'omit' from source: magic vars 15794 1726882604.90636: variable 'omit' from source: magic vars 15794 1726882604.90640: variable 'omit' from source: magic vars 15794 1726882604.90764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882604.90801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882604.90825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882604.90873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882604.90959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882604.90995: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882604.91139: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882604.91143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882604.91395: Set connection var ansible_connection to ssh 15794 1726882604.91399: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882604.91401: Set connection var ansible_pipelining to False 15794 1726882604.91404: Set connection var ansible_shell_executable to /bin/sh 15794 1726882604.91406: Set connection var ansible_shell_type to sh 15794 1726882604.91409: Set connection var ansible_timeout to 10 15794 1726882604.91438: variable 'ansible_shell_executable' from source: unknown 15794 1726882604.91510: variable 'ansible_connection' from 
source: unknown 15794 1726882604.91520: variable 'ansible_module_compression' from source: unknown 15794 1726882604.91528: variable 'ansible_shell_type' from source: unknown 15794 1726882604.91539: variable 'ansible_shell_executable' from source: unknown 15794 1726882604.91547: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882604.91556: variable 'ansible_pipelining' from source: unknown 15794 1726882604.91564: variable 'ansible_timeout' from source: unknown 15794 1726882604.91573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882604.91983: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882604.92001: variable 'omit' from source: magic vars 15794 1726882604.92013: starting attempt loop 15794 1726882604.92022: running the handler 15794 1726882604.92243: _low_level_execute_command(): starting 15794 1726882604.92247: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882604.93695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882604.93714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882604.93801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882604.94133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882604.94150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.94221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882604.96017: stdout chunk (state=3): >>>/root <<< 15794 1726882604.96246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882604.96250: stdout chunk (state=3): >>><<< 15794 1726882604.96252: stderr chunk (state=3): >>><<< 15794 1726882604.96488: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882604.96499: _low_level_execute_command(): starting 15794 1726882604.96503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166 `" && echo ansible-tmp-1726882604.9633741-15874-111163882840166="` echo /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166 `" ) && sleep 0' 15794 1726882604.98470: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882604.98493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882604.98593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882604.98863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882604.98867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882604.98940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 15794 1726882605.01017: stdout chunk (state=3): >>>ansible-tmp-1726882604.9633741-15874-111163882840166=/root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166 <<< 15794 1726882605.01229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882605.01245: stdout chunk (state=3): >>><<< 15794 1726882605.01260: stderr chunk (state=3): >>><<< 15794 1726882605.01444: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882604.9633741-15874-111163882840166=/root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882605.01447: variable 'ansible_module_compression' from source: unknown 15794 1726882605.01590: ANSIBALLZ: Using lock for stat 15794 1726882605.01601: ANSIBALLZ: Acquiring lock 15794 1726882605.01740: ANSIBALLZ: Lock acquired: 
139758818401680 15794 1726882605.01744: ANSIBALLZ: Creating module 15794 1726882605.25509: ANSIBALLZ: Writing module into payload 15794 1726882605.25653: ANSIBALLZ: Writing module 15794 1726882605.25688: ANSIBALLZ: Renaming module 15794 1726882605.25701: ANSIBALLZ: Done creating module 15794 1726882605.25722: variable 'ansible_facts' from source: unknown 15794 1726882605.25811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/AnsiballZ_stat.py 15794 1726882605.25959: Sending initial data 15794 1726882605.26086: Sent initial data (153 bytes) 15794 1726882605.26611: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882605.26628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882605.26740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882605.26957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882605.27232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882605.28977: 
stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15794 1726882605.28996: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15794 1726882605.29011: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15794 1726882605.29033: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882605.29096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882605.29274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/AnsiballZ_stat.py" <<< 15794 1726882605.29280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpznhhahwy /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/AnsiballZ_stat.py <<< 15794 1726882605.29325: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpznhhahwy" to remote "/root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/AnsiballZ_stat.py" <<< 15794 1726882605.31581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882605.31593: stdout chunk (state=3): >>><<< 15794 1726882605.31616: stderr chunk (state=3): >>><<< 15794 1726882605.31974: done transferring module to remote 15794 1726882605.31977: _low_level_execute_command(): starting 15794 1726882605.31983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/ /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/AnsiballZ_stat.py && sleep 0' 15794 1726882605.33063: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882605.33117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882605.33151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882605.33169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882605.33183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 
1726882605.33189: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882605.33250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882605.33393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882605.33462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882605.33563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882605.35588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882605.35591: stdout chunk (state=3): >>><<< 15794 1726882605.35600: stderr chunk (state=3): >>><<< 15794 1726882605.35654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882605.35665: _low_level_execute_command(): starting 15794 1726882605.35676: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/AnsiballZ_stat.py && sleep 0' 15794 1726882605.36701: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882605.36715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882605.36726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882605.37066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 15794 1726882605.37168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882605.39362: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15794 1726882605.39391: stdout chunk (state=3): >>>import _imp # builtin <<< 15794 1726882605.39429: stdout chunk (state=3): >>>import '_thread' # <<< 15794 1726882605.39456: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 15794 1726882605.39576: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 15794 1726882605.39598: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15794 1726882605.39623: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 15794 1726882605.39691: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15794 1726882605.39722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 15794 1726882605.39764: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15794 1726882605.39915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98f2c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98efbb30> <<< 15794 1726882605.39923: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98f2eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 15794 1726882605.39948: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15794 1726882605.40039: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15794 1726882605.40103: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15794 1726882605.40207: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 15794 1726882605.40220: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d41160> <<< 15794 1726882605.40276: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15794 1726882605.40300: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d41fd0> <<< 15794 1726882605.40323: stdout chunk (state=3): >>>import 'site' # <<< 15794 1726882605.40374: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15794 1726882605.40677: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15794 1726882605.40685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15794 1726882605.40722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15794 1726882605.40953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7fe90> <<< 15794 1726882605.40956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15794 1726882605.40982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 15794 1726882605.41018: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches 
/usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98db7860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15794 1726882605.41046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98db7ef0> <<< 15794 1726882605.41058: stdout chunk (state=3): >>>import '_collections' # <<< 15794 1726882605.41099: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d97b30> <<< 15794 1726882605.41116: stdout chunk (state=3): >>>import '_functools' # <<< 15794 1726882605.41143: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d951f0> <<< 15794 1726882605.41243: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7d040> <<< 15794 1726882605.41289: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15794 1726882605.41362: stdout chunk (state=3): >>>import '_sre' # <<< 15794 1726882605.41376: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15794 1726882605.41479: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98ddb800> <<< 15794 1726882605.41487: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98dda420> <<< 15794 1726882605.41684: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98dd8c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7c2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e0ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e0cf50> import 'base64' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7ade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.41688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15794 1726882605.41726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15794 1726882605.41748: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0d640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0d310> import 'importlib.machinery' # <<< 15794 1726882605.41784: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15794 1726882605.42169: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0e540> import 'importlib.util' # <<< 15794 1726882605.42298: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e24770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e25eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e26d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e273b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e262d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e27e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e27560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0e5a0> <<< 15794 1726882605.42330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches 
/usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15794 1726882605.42396: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98bbfcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15794 1726882605.42402: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882605.42406: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98be8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98be84a0> <<< 15794 1726882605.42433: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98be8770> <<< 15794 1726882605.42468: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98be8950> import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98bbde50> <<< 15794 1726882605.42489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15794 1726882605.42675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98bea030> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98be8cb0> <<< 15794 1726882605.42702: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0ec90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15794 1726882605.42755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.42777: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15794 1726882605.42816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15794 1726882605.42856: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c1a390> <<< 15794 1726882605.42923: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.42945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py <<< 15794 1726882605.42965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15794 1726882605.43032: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c32540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15794 1726882605.43067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15794 1726882605.43146: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 15794 1726882605.43170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c6b2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15794 1726882605.43205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15794 1726882605.43266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15794 1726882605.43280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15794 1726882605.43372: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c91a60> <<< 15794 1726882605.43450: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c6b3e0> <<< 15794 1726882605.43490: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c331d0> <<< 15794 1726882605.43576: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98aac2f0> <<< 15794 1726882605.43580: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c31580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98beaf30> <<< 15794 1726882605.43839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15794 1726882605.43843: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6b98aac500> <<< 15794 1726882605.43866: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_q5q4aeyj/ansible_stat_payload.zip' # zipimport: zlib available <<< 15794 1726882605.43900: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.43938: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15794 1726882605.43953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15794 1726882605.43987: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15794 1726882605.44069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15794 1726882605.44106: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b01f70> <<< 15794 1726882605.44120: stdout chunk (state=3): >>>import '_typing' # <<< 15794 1726882605.44314: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98ad8e60> <<< 15794 1726882605.44335: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98aaffb0> # zipimport: zlib available <<< 15794 1726882605.44364: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 15794 1726882605.44409: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15794 1726882605.44512: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 15794 1726882605.45970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.47235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 15794 1726882605.47343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98adbe00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.47367: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98b299d0> <<< 15794 1726882605.47384: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b29760> <<< 15794 1726882605.47415: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b29070> <<< 15794 1726882605.47496: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15794 1726882605.47501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15794 1726882605.47504: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b294c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b02c00> <<< 15794 1726882605.47575: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98b2a7b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882605.47591: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98b2a9f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15794 
1726882605.47713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15794 1726882605.47788: stdout chunk (state=3): >>>import '_locale' # <<< 15794 1726882605.47845: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b2af30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b9898cc20> <<< 15794 1726882605.47849: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b9898e840> <<< 15794 1726882605.47868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15794 1726882605.47890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15794 1726882605.47978: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b9898f1d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15794 1726882605.47982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15794 1726882605.48038: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989903b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15794 1726882605.48179: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15794 1726882605.48183: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98992e40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989931a0> <<< 15794 1726882605.48217: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98991100> <<< 15794 1726882605.48221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15794 1726882605.48243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15794 1726882605.48314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15794 1726882605.48354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15794 1726882605.48684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6b98996e40> import '_tokenize' # <<< 15794 1726882605.48688: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98995910> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98995670> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98997f50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98991610> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989deff0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989df110> <<< 15794 1726882605.48690: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15794 1726882605.48721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15794 1726882605.48771: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989e0cb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989e0a70> <<< 15794 1726882605.48793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15794 1726882605.48901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15794 1726882605.48949: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989e3230> <<< 15794 1726882605.48973: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989e13a0> <<< 15794 1726882605.49000: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15794 1726882605.49026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.49065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15794 1726882605.49080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15794 1726882605.49128: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989eaa50> <<< 15794 1726882605.49285: stdout chunk 
(state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989e33e0> <<< 15794 1726882605.49362: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989eb860> <<< 15794 1726882605.49406: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989eba70> <<< 15794 1726882605.49475: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989ebe00> <<< 15794 1726882605.49479: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989df3b0> <<< 15794 1726882605.49524: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15794 1726882605.49527: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15794 
1726882605.49567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15794 1726882605.49584: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882605.49613: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882605.49629: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989eec90> <<< 15794 1726882605.49803: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882605.49859: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989f01d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989ed430> <<< 15794 1726882605.49951: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989ee7b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989ecfe0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 15794 1726882605.49968: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.50051: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.50128: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15794 1726882605.50176: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 15794 1726882605.50180: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.50210: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15794 1726882605.50354: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.50497: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.51180: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.51868: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15794 1726882605.51872: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 15794 1726882605.51911: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15794 1726882605.51969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.51987: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98a74410> <<< 15794 1726882605.52138: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15794 1726882605.52195: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f6b98a751f0> <<< 15794 1726882605.52217: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989f3890> <<< 15794 1726882605.52255: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 15794 1726882605.52259: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15794 1726882605.52464: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.52629: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15794 1726882605.52655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a74fb0> <<< 15794 1726882605.52676: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.53230: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.53788: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.53882: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.53960: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15794 1726882605.53970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.54016: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.54069: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15794 1726882605.54073: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.54153: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.54286: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15794 1726882605.54290: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.54324: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15794 1726882605.54366: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.54419: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15794 1726882605.54700: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.54993: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15794 1726882605.55069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15794 1726882605.55089: stdout chunk (state=3): >>>import '_ast' # <<< 15794 1726882605.55180: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a77b90> <<< 15794 1726882605.55187: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.55268: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.55373: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15794 1726882605.55400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15794 1726882605.55492: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15794 1726882605.55770: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f6b98a81f40> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98a828a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a76810> # zipimport: zlib available # zipimport: zlib available <<< 15794 1726882605.55795: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15794 1726882605.55807: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.55859: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.55898: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.55962: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.56037: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15794 1726882605.56081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.56205: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98a814f0> <<< 15794 1726882605.56307: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a829f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15794 
1726882605.56342: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.56409: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.56444: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.56488: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15794 1726882605.56536: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15794 1726882605.56551: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15794 1726882605.56565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15794 1726882605.56653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15794 1726882605.56656: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15794 1726882605.56669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15794 1726882605.56746: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98912c00> <<< 15794 1726882605.56778: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b9888c9e0> <<< 15794 1726882605.56862: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98886a20> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98886810> # 
destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15794 1726882605.56906: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.56909: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.56965: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15794 1726882605.57037: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 15794 1726882605.57076: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15794 1726882605.57203: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.57426: stdout chunk (state=3): >>># zipimport: zlib available <<< 15794 1726882605.57585: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 15794 1726882605.57992: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] 
removing encodings.aliases # cleanup[2] removing encodings <<< 15794 1726882605.57997: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 15794 1726882605.58040: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma 
# cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect <<< 15794 1726882605.58043: stdout chunk (state=3): >>># destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 <<< 15794 1726882605.58340: stdout chunk (state=3): >>># cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] 
removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15794 1726882605.58474: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # 
destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 15794 1726882605.58498: stdout chunk (state=3): >>># destroy importlib <<< 15794 1726882605.58517: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 15794 1726882605.58545: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select <<< 15794 1726882605.58580: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15794 1726882605.58607: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 15794 1726882605.58633: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 15794 1726882605.58658: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 15794 1726882605.58701: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 15794 1726882605.58733: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 15794 1726882605.58762: stdout chunk (state=3): >>># cleanup[3] 
wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 15794 1726882605.58799: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 15794 1726882605.58838: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre <<< 15794 1726882605.58888: stdout chunk (state=3): >>># cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 15794 1726882605.58928: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15794 1726882605.59082: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15794 1726882605.59089: stdout chunk (state=3): >>># destroy _collections <<< 15794 1726882605.59113: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 15794 1726882605.59181: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 15794 1726882605.59207: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15794 1726882605.59225: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15794 1726882605.59337: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 15794 1726882605.59459: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15794 1726882605.59882: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882605.60140: stderr chunk (state=3): >>><<< 15794 1726882605.60143: stdout chunk (state=3): >>><<< 15794 1726882605.60181: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98f2c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98efbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98f2eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d41160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d41fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f6b98d7ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98db7860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98db7ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d97b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d951f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6b98ddb800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98dda420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98dd8c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7c2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e0ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e0cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98d7ade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object 
from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0d640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0d310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0e540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e24770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e25eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e26d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e273b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e262d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98e27e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e27560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0e5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98bbfcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98be8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98be84a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98be8770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98be8950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98bbde50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98bea030> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98be8cb0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98e0ec90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c1a390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c32540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c6b2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c91a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c6b3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c331d0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98aac2f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98c31580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98beaf30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6b98aac500> # zipimport: found 30 names in '/tmp/ansible_stat_payload_q5q4aeyj/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b01f70> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98ad8e60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98aaffb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98adbe00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98b299d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b29760> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b29070> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b294c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b02c00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98b2a7b0> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98b2a9f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98b2af30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b9898cc20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b9898e840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b9898f1d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989903b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98992e40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989931a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98991100> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98996e40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98995910> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98995670> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98997f50> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98991610> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989deff0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989df110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989e0cb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989e0a70> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989e3230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989e13a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989eaa50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989e33e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989eb860> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989eba70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989ebe00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989df3b0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989eec90> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989f01d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989ed430> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b989ee7b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989ecfe0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98a74410> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a751f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b989f3890> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a74fb0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a77b90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98a81f40> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98a828a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a76810> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6b98a814f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98a829f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98912c00> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b9888c9e0> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6b98886a20> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6b98886810> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
[WARNING]: Module invocation had junk after the JSON data: [Python interpreter cleanup trace, identical to the module output shown above; elided] 15794 1726882605.61698: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir':
'/root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882605.61702: _low_level_execute_command(): starting 15794 1726882605.61705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882604.9633741-15874-111163882840166/ > /dev/null 2>&1 && sleep 0' 15794 1726882605.61766: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882605.61783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882605.61802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882605.61823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882605.61941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882605.61955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882605.61977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882605.62076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 
1726882605.64012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882605.64084: stderr chunk (state=3): >>><<< 15794 1726882605.64094: stdout chunk (state=3): >>><<< 15794 1726882605.64115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882605.64131: handler run complete 15794 1726882605.64161: attempt loop complete, returning result 15794 1726882605.64168: _execute() done 15794 1726882605.64174: dumping result to json 15794 1726882605.64340: done dumping result, returning 15794 1726882605.64343: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0affe814-3a2d-94e5-e48f-00000000008f] 15794 1726882605.64346: sending task result for task 0affe814-3a2d-94e5-e48f-00000000008f 15794 1726882605.64420: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000008f 
15794 1726882605.64424: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15794 1726882605.64508: no more pending results, returning what we have 15794 1726882605.64512: results queue empty 15794 1726882605.64513: checking for any_errors_fatal 15794 1726882605.64525: done checking for any_errors_fatal 15794 1726882605.64526: checking for max_fail_percentage 15794 1726882605.64528: done checking for max_fail_percentage 15794 1726882605.64529: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.64530: done checking to see if all hosts have failed 15794 1726882605.64531: getting the remaining hosts for this loop 15794 1726882605.64539: done getting the remaining hosts for this loop 15794 1726882605.64545: getting the next task for host managed_node1 15794 1726882605.64554: done getting next task for host managed_node1 15794 1726882605.64557: ^ task is: TASK: Set flag to indicate system is ostree 15794 1726882605.64561: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.64565: getting variables 15794 1726882605.64568: in VariableManager get_vars() 15794 1726882605.64605: Calling all_inventory to load vars for managed_node1 15794 1726882605.64608: Calling groups_inventory to load vars for managed_node1 15794 1726882605.64613: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.64625: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.64629: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.64633: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.65224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.65967: done with get_vars() 15794 1726882605.65979: done getting variables 15794 1726882605.66205: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:36:45 -0400 (0:00:00.806) 0:00:03.221 ****** 15794 1726882605.66350: entering _queue_task() for managed_node1/set_fact 15794 1726882605.66352: Creating lock for set_fact 15794 1726882605.66851: worker is 1 (out of 1 available) 15794 1726882605.66863: exiting _queue_task() for managed_node1/set_fact 15794 1726882605.66876: done queuing things up, now waiting for results queue to drain 15794 1726882605.66878: waiting for pending results... 
15794 1726882605.67286: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 15794 1726882605.67317: in run() - task 0affe814-3a2d-94e5-e48f-000000000090 15794 1726882605.67338: variable 'ansible_search_path' from source: unknown 15794 1726882605.67349: variable 'ansible_search_path' from source: unknown 15794 1726882605.67401: calling self._execute() 15794 1726882605.67511: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.67524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.67543: variable 'omit' from source: magic vars 15794 1726882605.68125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882605.68430: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882605.68539: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882605.68543: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882605.68592: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882605.68725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882605.68765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882605.68811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882605.68852: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882605.69015: Evaluated conditional (not __network_is_ostree is defined): True 15794 1726882605.69019: variable 'omit' from source: magic vars 15794 1726882605.69066: variable 'omit' from source: magic vars 15794 1726882605.69235: variable '__ostree_booted_stat' from source: set_fact 15794 1726882605.69344: variable 'omit' from source: magic vars 15794 1726882605.69347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882605.69375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882605.69400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882605.69426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882605.69450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882605.69494: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882605.69503: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.69561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.69644: Set connection var ansible_connection to ssh 15794 1726882605.69660: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882605.69682: Set connection var ansible_pipelining to False 15794 1726882605.69696: Set connection var ansible_shell_executable to /bin/sh 15794 1726882605.69705: Set connection var ansible_shell_type to sh 15794 1726882605.69720: Set connection var ansible_timeout to 10 15794 1726882605.69759: variable 'ansible_shell_executable' 
from source: unknown 15794 1726882605.69769: variable 'ansible_connection' from source: unknown 15794 1726882605.69788: variable 'ansible_module_compression' from source: unknown 15794 1726882605.69885: variable 'ansible_shell_type' from source: unknown 15794 1726882605.69889: variable 'ansible_shell_executable' from source: unknown 15794 1726882605.69893: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.69896: variable 'ansible_pipelining' from source: unknown 15794 1726882605.69898: variable 'ansible_timeout' from source: unknown 15794 1726882605.69900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.69968: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882605.69986: variable 'omit' from source: magic vars 15794 1726882605.69998: starting attempt loop 15794 1726882605.70007: running the handler 15794 1726882605.70035: handler run complete 15794 1726882605.70053: attempt loop complete, returning result 15794 1726882605.70127: _execute() done 15794 1726882605.70131: dumping result to json 15794 1726882605.70133: done dumping result, returning 15794 1726882605.70137: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affe814-3a2d-94e5-e48f-000000000090] 15794 1726882605.70139: sending task result for task 0affe814-3a2d-94e5-e48f-000000000090 15794 1726882605.70205: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000090 15794 1726882605.70208: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15794 1726882605.70290: no more pending results, returning what we have 15794 1726882605.70294: results 
queue empty 15794 1726882605.70295: checking for any_errors_fatal 15794 1726882605.70304: done checking for any_errors_fatal 15794 1726882605.70305: checking for max_fail_percentage 15794 1726882605.70307: done checking for max_fail_percentage 15794 1726882605.70308: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.70309: done checking to see if all hosts have failed 15794 1726882605.70310: getting the remaining hosts for this loop 15794 1726882605.70312: done getting the remaining hosts for this loop 15794 1726882605.70316: getting the next task for host managed_node1 15794 1726882605.70327: done getting next task for host managed_node1 15794 1726882605.70330: ^ task is: TASK: Fix CentOS6 Base repo 15794 1726882605.70333: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.70339: getting variables 15794 1726882605.70341: in VariableManager get_vars() 15794 1726882605.70373: Calling all_inventory to load vars for managed_node1 15794 1726882605.70376: Calling groups_inventory to load vars for managed_node1 15794 1726882605.70380: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.70392: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.70395: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.70405: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.70830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.71139: done with get_vars() 15794 1726882605.71151: done getting variables 15794 1726882605.71282: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:36:45 -0400 (0:00:00.049) 0:00:03.271 ****** 15794 1726882605.71315: entering _queue_task() for managed_node1/copy 15794 1726882605.71562: worker is 1 (out of 1 available) 15794 1726882605.71574: exiting _queue_task() for managed_node1/copy 15794 1726882605.71588: done queuing things up, now waiting for results queue to drain 15794 1726882605.71589: waiting for pending results... 
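The set_fact trace above — the conditional `not __network_is_ostree is defined` evaluating True, then `__network_is_ostree: false` being set from the earlier `stat` result (`"exists": false`) — is consistent with a task of roughly this shape. This is a hedged reconstruction: the playbook source is not shown in the log, so the exact expression is an assumption; only the task name, the variable names, and the conditional are confirmed by the trace.

```yaml
# Hypothetical reconstruction of el_repo_setup.yml:22, inferred from the
# trace above. __ostree_booted_stat comes from a prior stat task whose
# result ("exists": false) appears earlier in this log.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```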
15794 1726882605.71846: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 15794 1726882605.72039: in run() - task 0affe814-3a2d-94e5-e48f-000000000092 15794 1726882605.72043: variable 'ansible_search_path' from source: unknown 15794 1726882605.72047: variable 'ansible_search_path' from source: unknown 15794 1726882605.72050: calling self._execute() 15794 1726882605.72116: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.72130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.72150: variable 'omit' from source: magic vars 15794 1726882605.72760: variable 'ansible_distribution' from source: facts 15794 1726882605.72820: Evaluated conditional (ansible_distribution == 'CentOS'): False 15794 1726882605.72823: when evaluation is False, skipping this task 15794 1726882605.72827: _execute() done 15794 1726882605.72829: dumping result to json 15794 1726882605.72832: done dumping result, returning 15794 1726882605.72836: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affe814-3a2d-94e5-e48f-000000000092] 15794 1726882605.72842: sending task result for task 0affe814-3a2d-94e5-e48f-000000000092 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 15794 1726882605.73018: no more pending results, returning what we have 15794 1726882605.73022: results queue empty 15794 1726882605.73023: checking for any_errors_fatal 15794 1726882605.73028: done checking for any_errors_fatal 15794 1726882605.73029: checking for max_fail_percentage 15794 1726882605.73031: done checking for max_fail_percentage 15794 1726882605.73032: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.73033: done checking to see if all hosts have failed 15794 1726882605.73036: getting the remaining hosts for this loop 15794 1726882605.73038: done getting 
the remaining hosts for this loop 15794 1726882605.73043: getting the next task for host managed_node1 15794 1726882605.73052: done getting next task for host managed_node1 15794 1726882605.73055: ^ task is: TASK: Include the task 'enable_epel.yml' 15794 1726882605.73059: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882605.73064: getting variables 15794 1726882605.73066: in VariableManager get_vars() 15794 1726882605.73096: Calling all_inventory to load vars for managed_node1 15794 1726882605.73099: Calling groups_inventory to load vars for managed_node1 15794 1726882605.73104: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.73117: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.73122: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.73126: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.73331: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000092 15794 1726882605.73336: WORKER PROCESS EXITING 15794 1726882605.73581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.73847: done with get_vars() 15794 1726882605.73857: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:36:45 -0400 (0:00:00.026) 0:00:03.297 ****** 15794 1726882605.73956: entering _queue_task() for managed_node1/include_tasks 15794 1726882605.74182: worker is 1 (out of 1 available) 15794 1726882605.74195: exiting _queue_task() for managed_node1/include_tasks 15794 1726882605.74206: done queuing things up, now waiting for results queue to drain 15794 1726882605.74208: waiting for pending results... 15794 1726882605.74450: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 15794 1726882605.74641: in run() - task 0affe814-3a2d-94e5-e48f-000000000093 15794 1726882605.74644: variable 'ansible_search_path' from source: unknown 15794 1726882605.74647: variable 'ansible_search_path' from source: unknown 15794 1726882605.74650: calling self._execute() 15794 1726882605.74720: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.74736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.74754: variable 'omit' from source: magic vars 15794 1726882605.75328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882605.77903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882605.77986: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882605.78043: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882605.78105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882605.78143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882605.78324: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882605.78328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882605.78331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882605.78378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882605.78400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882605.78542: variable '__network_is_ostree' from source: set_fact 15794 1726882605.78567: Evaluated conditional (not __network_is_ostree | d(false)): True 15794 1726882605.78580: _execute() done 15794 1726882605.78589: dumping result to json 15794 1726882605.78599: done dumping result, returning 15794 1726882605.78611: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affe814-3a2d-94e5-e48f-000000000093] 15794 1726882605.78624: sending task result for task 0affe814-3a2d-94e5-e48f-000000000093 15794 1726882605.78785: no more pending results, returning what we have 15794 1726882605.78791: in VariableManager get_vars() 15794 1726882605.78829: Calling all_inventory to load vars for managed_node1 15794 1726882605.78832: Calling groups_inventory to load vars for managed_node1 15794 1726882605.78839: Calling all_plugins_inventory to load vars 
for managed_node1 15794 1726882605.78852: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.78856: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.78860: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.79323: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000093 15794 1726882605.79327: WORKER PROCESS EXITING 15794 1726882605.79355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.79624: done with get_vars() 15794 1726882605.79635: variable 'ansible_search_path' from source: unknown 15794 1726882605.79636: variable 'ansible_search_path' from source: unknown 15794 1726882605.79680: we have included files to process 15794 1726882605.79682: generating all_blocks data 15794 1726882605.79684: done generating all_blocks data 15794 1726882605.79691: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15794 1726882605.79693: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15794 1726882605.79696: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15794 1726882605.80602: done processing included file 15794 1726882605.80605: iterating over new_blocks loaded from include file 15794 1726882605.80606: in VariableManager get_vars() 15794 1726882605.80620: done with get_vars() 15794 1726882605.80622: filtering new block on tags 15794 1726882605.80654: done filtering new block on tags 15794 1726882605.80657: in VariableManager get_vars() 15794 1726882605.80671: done with get_vars() 15794 1726882605.80673: filtering new block on tags 15794 1726882605.80688: done filtering new block on tags 15794 1726882605.80690: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 15794 1726882605.80696: extending task lists for all hosts with included blocks 15794 1726882605.80837: done extending task lists 15794 1726882605.80839: done processing included files 15794 1726882605.80840: results queue empty 15794 1726882605.80841: checking for any_errors_fatal 15794 1726882605.80845: done checking for any_errors_fatal 15794 1726882605.80846: checking for max_fail_percentage 15794 1726882605.80848: done checking for max_fail_percentage 15794 1726882605.80848: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.80849: done checking to see if all hosts have failed 15794 1726882605.80850: getting the remaining hosts for this loop 15794 1726882605.80852: done getting the remaining hosts for this loop 15794 1726882605.80855: getting the next task for host managed_node1 15794 1726882605.80860: done getting next task for host managed_node1 15794 1726882605.80862: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15794 1726882605.80869: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.80871: getting variables 15794 1726882605.80872: in VariableManager get_vars() 15794 1726882605.80882: Calling all_inventory to load vars for managed_node1 15794 1726882605.80885: Calling groups_inventory to load vars for managed_node1 15794 1726882605.80889: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.80895: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.80903: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.80908: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.81072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.81237: done with get_vars() 15794 1726882605.81244: done getting variables 15794 1726882605.81298: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15794 1726882605.81461: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:36:45 -0400 (0:00:00.075) 0:00:03.372 ****** 15794 1726882605.81498: entering _queue_task() for managed_node1/command 15794 1726882605.81499: Creating lock for command 15794 1726882605.81705: worker is 1 (out of 1 available) 15794 1726882605.81718: exiting _queue_task() for managed_node1/command 15794 1726882605.81732: done queuing things up, now waiting for results queue to drain 15794 1726882605.81735: waiting for pending results... 
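The include step traced above — the conditional `not __network_is_ostree | d(false)` evaluating True, enable_epel.yml being loaded, its blocks filtered on tags, and the host's task list extended — would typically come from an `include_tasks` entry like the following. This is a sketch, not the actual source; the relative path is assumed.

```yaml
# Sketch of el_repo_setup.yml:51 as implied by the trace; only the task
# name and the when: condition are confirmed by the log output.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```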
15794 1726882605.81902: running TaskExecutor() for managed_node1/TASK: Create EPEL 39 15794 1726882605.81981: in run() - task 0affe814-3a2d-94e5-e48f-0000000000ad 15794 1726882605.81995: variable 'ansible_search_path' from source: unknown 15794 1726882605.81999: variable 'ansible_search_path' from source: unknown 15794 1726882605.82030: calling self._execute() 15794 1726882605.82107: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.82113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.82123: variable 'omit' from source: magic vars 15794 1726882605.82428: variable 'ansible_distribution' from source: facts 15794 1726882605.82438: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15794 1726882605.82442: when evaluation is False, skipping this task 15794 1726882605.82446: _execute() done 15794 1726882605.82448: dumping result to json 15794 1726882605.82454: done dumping result, returning 15794 1726882605.82460: done running TaskExecutor() for managed_node1/TASK: Create EPEL 39 [0affe814-3a2d-94e5-e48f-0000000000ad] 15794 1726882605.82467: sending task result for task 0affe814-3a2d-94e5-e48f-0000000000ad 15794 1726882605.82575: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000000ad 15794 1726882605.82578: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15794 1726882605.82636: no more pending results, returning what we have 15794 1726882605.82639: results queue empty 15794 1726882605.82640: checking for any_errors_fatal 15794 1726882605.82642: done checking for any_errors_fatal 15794 1726882605.82643: checking for max_fail_percentage 15794 1726882605.82644: done checking for max_fail_percentage 15794 1726882605.82645: checking to see if all hosts have failed and the running result is not ok 15794 
1726882605.82646: done checking to see if all hosts have failed 15794 1726882605.82647: getting the remaining hosts for this loop 15794 1726882605.82648: done getting the remaining hosts for this loop 15794 1726882605.82652: getting the next task for host managed_node1 15794 1726882605.82658: done getting next task for host managed_node1 15794 1726882605.82661: ^ task is: TASK: Install yum-utils package 15794 1726882605.82664: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.82667: getting variables 15794 1726882605.82669: in VariableManager get_vars() 15794 1726882605.82696: Calling all_inventory to load vars for managed_node1 15794 1726882605.82699: Calling groups_inventory to load vars for managed_node1 15794 1726882605.82703: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.82710: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.82713: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.82715: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.82847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.83058: done with get_vars() 15794 1726882605.83069: done getting variables 15794 1726882605.83166: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:36:45 -0400 (0:00:00.016) 0:00:03.389 ****** 15794 1726882605.83198: entering _queue_task() for managed_node1/package 15794 1726882605.83201: Creating lock for package 15794 1726882605.83442: worker is 1 (out of 1 available) 15794 1726882605.83456: exiting _queue_task() for managed_node1/package 15794 1726882605.83471: done queuing things up, now waiting for results queue to drain 15794 1726882605.83473: waiting for pending results... 
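The repeated `skipping: [managed_node1]` results above all follow one pattern: the task's `when:` expression is evaluated against the host's facts, and a False result short-circuits execution, reporting the failing expression as `false_condition`. A minimal sketch of that gating logic follows — hypothetical helper names, illustration only, not Ansible's actual implementation (which templates the expression through Jinja2 rather than `eval`):

```python
# Illustration of how a when: clause gates a task. evaluate_when is a
# hypothetical helper; the returned dict mirrors the skip results in the
# trace above.
def evaluate_when(condition: str, facts: dict) -> dict:
    """Evaluate a conditional against host facts and build the task result."""
    # Ansible templates the expression with Jinja2; eval() stands in for
    # that here purely for the sake of the illustration.
    if not eval(condition, {}, facts):
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": False}

# Assumed: the managed host reports a distribution other than RHEL/CentOS,
# matching the False evaluations seen in the log.
facts = {"ansible_distribution": "Fedora"}
result = evaluate_when("ansible_distribution in ['RedHat', 'CentOS']", facts)
# result["skip_reason"] == "Conditional result was False"
```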
15794 1726882605.83853: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 15794 1726882605.83865: in run() - task 0affe814-3a2d-94e5-e48f-0000000000ae 15794 1726882605.83885: variable 'ansible_search_path' from source: unknown 15794 1726882605.83894: variable 'ansible_search_path' from source: unknown 15794 1726882605.83940: calling self._execute() 15794 1726882605.84030: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.84047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.84066: variable 'omit' from source: magic vars 15794 1726882605.84431: variable 'ansible_distribution' from source: facts 15794 1726882605.84449: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15794 1726882605.84453: when evaluation is False, skipping this task 15794 1726882605.84456: _execute() done 15794 1726882605.84459: dumping result to json 15794 1726882605.84463: done dumping result, returning 15794 1726882605.84470: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affe814-3a2d-94e5-e48f-0000000000ae] 15794 1726882605.84476: sending task result for task 0affe814-3a2d-94e5-e48f-0000000000ae 15794 1726882605.84575: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000000ae 15794 1726882605.84578: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15794 1726882605.84642: no more pending results, returning what we have 15794 1726882605.84645: results queue empty 15794 1726882605.84646: checking for any_errors_fatal 15794 1726882605.84651: done checking for any_errors_fatal 15794 1726882605.84652: checking for max_fail_percentage 15794 1726882605.84654: done checking for max_fail_percentage 15794 1726882605.84655: checking to see if all hosts have failed and the running result is not ok 
15794 1726882605.84655: done checking to see if all hosts have failed 15794 1726882605.84656: getting the remaining hosts for this loop 15794 1726882605.84658: done getting the remaining hosts for this loop 15794 1726882605.84661: getting the next task for host managed_node1 15794 1726882605.84667: done getting next task for host managed_node1 15794 1726882605.84670: ^ task is: TASK: Enable EPEL 7 15794 1726882605.84674: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.84677: getting variables 15794 1726882605.84678: in VariableManager get_vars() 15794 1726882605.84703: Calling all_inventory to load vars for managed_node1 15794 1726882605.84705: Calling groups_inventory to load vars for managed_node1 15794 1726882605.84707: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.84715: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.84717: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.84720: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.84848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.85001: done with get_vars() 15794 1726882605.85008: done getting variables 15794 1726882605.85054: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:36:45 -0400 (0:00:00.018) 0:00:03.408 ****** 15794 1726882605.85075: entering _queue_task() for managed_node1/command 15794 1726882605.85245: worker is 1 (out of 1 available) 15794 1726882605.85258: exiting _queue_task() for managed_node1/command 15794 1726882605.85270: done queuing things up, now waiting for results queue to drain 15794 1726882605.85271: waiting for pending results... 
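The "Install yum-utils package" entry above shows the `package` action gated on the same distribution check. The underlying task is presumably of this shape — the package name is inferred from the task title and is not visible in the log:

```yaml
# Hypothetical sketch of enable_epel.yml:26; only the task name, the
# package action, and the when: condition are confirmed by the trace.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']
```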
15794 1726882605.85413: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 15794 1726882605.85486: in run() - task 0affe814-3a2d-94e5-e48f-0000000000af 15794 1726882605.85499: variable 'ansible_search_path' from source: unknown 15794 1726882605.85504: variable 'ansible_search_path' from source: unknown 15794 1726882605.85531: calling self._execute() 15794 1726882605.85592: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.85598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.85609: variable 'omit' from source: magic vars 15794 1726882605.85895: variable 'ansible_distribution' from source: facts 15794 1726882605.85906: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15794 1726882605.85909: when evaluation is False, skipping this task 15794 1726882605.85912: _execute() done 15794 1726882605.85915: dumping result to json 15794 1726882605.85920: done dumping result, returning 15794 1726882605.85927: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affe814-3a2d-94e5-e48f-0000000000af] 15794 1726882605.85935: sending task result for task 0affe814-3a2d-94e5-e48f-0000000000af 15794 1726882605.86025: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000000af 15794 1726882605.86028: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15794 1726882605.86086: no more pending results, returning what we have 15794 1726882605.86090: results queue empty 15794 1726882605.86091: checking for any_errors_fatal 15794 1726882605.86097: done checking for any_errors_fatal 15794 1726882605.86098: checking for max_fail_percentage 15794 1726882605.86100: done checking for max_fail_percentage 15794 1726882605.86101: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.86102: 
done checking to see if all hosts have failed 15794 1726882605.86103: getting the remaining hosts for this loop 15794 1726882605.86104: done getting the remaining hosts for this loop 15794 1726882605.86108: getting the next task for host managed_node1 15794 1726882605.86114: done getting next task for host managed_node1 15794 1726882605.86116: ^ task is: TASK: Enable EPEL 8 15794 1726882605.86120: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.86124: getting variables 15794 1726882605.86125: in VariableManager get_vars() 15794 1726882605.86148: Calling all_inventory to load vars for managed_node1 15794 1726882605.86150: Calling groups_inventory to load vars for managed_node1 15794 1726882605.86153: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.86159: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.86162: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.86164: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.86317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.86510: done with get_vars() 15794 1726882605.86520: done getting variables 15794 1726882605.86579: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:36:45 -0400 (0:00:00.015) 0:00:03.424 ****** 15794 1726882605.86609: entering _queue_task() for managed_node1/command 15794 1726882605.86823: worker is 1 (out of 1 available) 15794 1726882605.87036: exiting _queue_task() for managed_node1/command 15794 1726882605.87050: done queuing things up, now waiting for results queue to drain 15794 1726882605.87051: waiting for pending results... 
15794 1726882605.87181: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 15794 1726882605.87239: in run() - task 0affe814-3a2d-94e5-e48f-0000000000b0 15794 1726882605.87261: variable 'ansible_search_path' from source: unknown 15794 1726882605.87273: variable 'ansible_search_path' from source: unknown 15794 1726882605.87323: calling self._execute() 15794 1726882605.87421: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.87440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.87459: variable 'omit' from source: magic vars 15794 1726882605.87883: variable 'ansible_distribution' from source: facts 15794 1726882605.87903: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15794 1726882605.87934: when evaluation is False, skipping this task 15794 1726882605.87938: _execute() done 15794 1726882605.87941: dumping result to json 15794 1726882605.87943: done dumping result, returning 15794 1726882605.87947: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affe814-3a2d-94e5-e48f-0000000000b0] 15794 1726882605.88041: sending task result for task 0affe814-3a2d-94e5-e48f-0000000000b0 15794 1726882605.88114: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000000b0 15794 1726882605.88117: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15794 1726882605.88195: no more pending results, returning what we have 15794 1726882605.88199: results queue empty 15794 1726882605.88200: checking for any_errors_fatal 15794 1726882605.88204: done checking for any_errors_fatal 15794 1726882605.88205: checking for max_fail_percentage 15794 1726882605.88207: done checking for max_fail_percentage 15794 1726882605.88208: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.88209: 
done checking to see if all hosts have failed 15794 1726882605.88209: getting the remaining hosts for this loop 15794 1726882605.88211: done getting the remaining hosts for this loop 15794 1726882605.88214: getting the next task for host managed_node1 15794 1726882605.88222: done getting next task for host managed_node1 15794 1726882605.88225: ^ task is: TASK: Enable EPEL 6 15794 1726882605.88229: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.88232: getting variables 15794 1726882605.88235: in VariableManager get_vars() 15794 1726882605.88264: Calling all_inventory to load vars for managed_node1 15794 1726882605.88268: Calling groups_inventory to load vars for managed_node1 15794 1726882605.88272: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.88281: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.88284: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.88286: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.88416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.88581: done with get_vars() 15794 1726882605.88589: done getting variables 15794 1726882605.88633: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:36:45 -0400 (0:00:00.020) 0:00:03.444 ****** 15794 1726882605.88657: entering _queue_task() for managed_node1/copy 15794 1726882605.88827: worker is 1 (out of 1 available) 15794 1726882605.88843: exiting _queue_task() for managed_node1/copy 15794 1726882605.88856: done queuing things up, now waiting for results queue to drain 15794 1726882605.88857: waiting for pending results... 
15794 1726882605.88996: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 15794 1726882605.89068: in run() - task 0affe814-3a2d-94e5-e48f-0000000000b2 15794 1726882605.89080: variable 'ansible_search_path' from source: unknown 15794 1726882605.89084: variable 'ansible_search_path' from source: unknown 15794 1726882605.89116: calling self._execute() 15794 1726882605.89176: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.89183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.89195: variable 'omit' from source: magic vars 15794 1726882605.89480: variable 'ansible_distribution' from source: facts 15794 1726882605.89492: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15794 1726882605.89495: when evaluation is False, skipping this task 15794 1726882605.89498: _execute() done 15794 1726882605.89502: dumping result to json 15794 1726882605.89507: done dumping result, returning 15794 1726882605.89513: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affe814-3a2d-94e5-e48f-0000000000b2] 15794 1726882605.89519: sending task result for task 0affe814-3a2d-94e5-e48f-0000000000b2 15794 1726882605.89613: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000000b2 15794 1726882605.89617: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15794 1726882605.89673: no more pending results, returning what we have 15794 1726882605.89676: results queue empty 15794 1726882605.89677: checking for any_errors_fatal 15794 1726882605.89681: done checking for any_errors_fatal 15794 1726882605.89682: checking for max_fail_percentage 15794 1726882605.89684: done checking for max_fail_percentage 15794 1726882605.89685: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.89686: 
done checking to see if all hosts have failed 15794 1726882605.89687: getting the remaining hosts for this loop 15794 1726882605.89689: done getting the remaining hosts for this loop 15794 1726882605.89692: getting the next task for host managed_node1 15794 1726882605.89700: done getting next task for host managed_node1 15794 1726882605.89703: ^ task is: TASK: Set network provider to 'nm' 15794 1726882605.89705: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882605.89708: getting variables 15794 1726882605.89710: in VariableManager get_vars() 15794 1726882605.89733: Calling all_inventory to load vars for managed_node1 15794 1726882605.89736: Calling groups_inventory to load vars for managed_node1 15794 1726882605.89741: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.89748: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.89750: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.89752: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.89905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.90096: done with get_vars() 15794 1726882605.90105: done getting variables 15794 1726882605.90150: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Friday 20 September 2024 21:36:45 -0400 (0:00:00.015) 0:00:03.459 ****** 15794 1726882605.90170: entering _queue_task() for managed_node1/set_fact 15794 1726882605.90367: worker is 1 (out of 1 available) 15794 1726882605.90382: exiting _queue_task() for managed_node1/set_fact 15794 1726882605.90394: done queuing things up, now waiting for results queue to drain 15794 1726882605.90396: waiting for pending results... 15794 1726882605.90654: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 15794 1726882605.90676: in run() - task 0affe814-3a2d-94e5-e48f-000000000007 15794 1726882605.90756: variable 'ansible_search_path' from source: unknown 15794 1726882605.90760: calling self._execute() 15794 1726882605.90821: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.90829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.90845: variable 'omit' from source: magic vars 15794 1726882605.90969: variable 'omit' from source: magic vars 15794 1726882605.91012: variable 'omit' from source: magic vars 15794 1726882605.91054: variable 'omit' from source: magic vars 15794 1726882605.91187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882605.91192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882605.91195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882605.91198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882605.91208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882605.91242: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882605.91246: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.91252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.91377: Set connection var ansible_connection to ssh 15794 1726882605.91391: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882605.91404: Set connection var ansible_pipelining to False 15794 1726882605.91417: Set connection var ansible_shell_executable to /bin/sh 15794 1726882605.91424: Set connection var ansible_shell_type to sh 15794 1726882605.91441: Set connection var ansible_timeout to 10 15794 1726882605.91477: variable 'ansible_shell_executable' from source: unknown 15794 1726882605.91485: variable 'ansible_connection' from source: unknown 15794 1726882605.91512: variable 'ansible_module_compression' from source: unknown 15794 1726882605.91515: variable 'ansible_shell_type' from source: unknown 15794 1726882605.91520: variable 'ansible_shell_executable' from source: unknown 15794 1726882605.91523: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.91525: variable 'ansible_pipelining' from source: unknown 15794 1726882605.91627: variable 'ansible_timeout' from source: unknown 15794 1726882605.91631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.91717: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882605.91743: variable 'omit' from source: magic vars 15794 1726882605.91755: starting attempt loop 15794 1726882605.91763: running the handler 15794 1726882605.91783: handler run complete 15794 1726882605.91798: attempt loop 
complete, returning result 15794 1726882605.91805: _execute() done 15794 1726882605.91811: dumping result to json 15794 1726882605.91840: done dumping result, returning 15794 1726882605.91846: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affe814-3a2d-94e5-e48f-000000000007] 15794 1726882605.91852: sending task result for task 0affe814-3a2d-94e5-e48f-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15794 1726882605.92170: no more pending results, returning what we have 15794 1726882605.92173: results queue empty 15794 1726882605.92174: checking for any_errors_fatal 15794 1726882605.92179: done checking for any_errors_fatal 15794 1726882605.92180: checking for max_fail_percentage 15794 1726882605.92182: done checking for max_fail_percentage 15794 1726882605.92183: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.92184: done checking to see if all hosts have failed 15794 1726882605.92185: getting the remaining hosts for this loop 15794 1726882605.92186: done getting the remaining hosts for this loop 15794 1726882605.92189: getting the next task for host managed_node1 15794 1726882605.92195: done getting next task for host managed_node1 15794 1726882605.92198: ^ task is: TASK: meta (flush_handlers) 15794 1726882605.92200: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.92204: getting variables 15794 1726882605.92205: in VariableManager get_vars() 15794 1726882605.92229: Calling all_inventory to load vars for managed_node1 15794 1726882605.92232: Calling groups_inventory to load vars for managed_node1 15794 1726882605.92237: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.92245: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.92248: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.92251: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.92495: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000007 15794 1726882605.92499: WORKER PROCESS EXITING 15794 1726882605.92517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.92688: done with get_vars() 15794 1726882605.92698: done getting variables 15794 1726882605.92774: in VariableManager get_vars() 15794 1726882605.92784: Calling all_inventory to load vars for managed_node1 15794 1726882605.92787: Calling groups_inventory to load vars for managed_node1 15794 1726882605.92790: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.92794: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.92797: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.92801: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.93109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.93299: done with get_vars() 15794 1726882605.93314: done queuing things up, now waiting for results queue to drain 15794 1726882605.93316: results queue empty 15794 1726882605.93317: checking for any_errors_fatal 15794 1726882605.93319: done checking for any_errors_fatal 15794 1726882605.93320: checking for max_fail_percentage 15794 
1726882605.93322: done checking for max_fail_percentage 15794 1726882605.93322: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.93323: done checking to see if all hosts have failed 15794 1726882605.93324: getting the remaining hosts for this loop 15794 1726882605.93326: done getting the remaining hosts for this loop 15794 1726882605.93328: getting the next task for host managed_node1 15794 1726882605.93332: done getting next task for host managed_node1 15794 1726882605.93336: ^ task is: TASK: meta (flush_handlers) 15794 1726882605.93337: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882605.93344: getting variables 15794 1726882605.93345: in VariableManager get_vars() 15794 1726882605.93354: Calling all_inventory to load vars for managed_node1 15794 1726882605.93356: Calling groups_inventory to load vars for managed_node1 15794 1726882605.93359: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.93364: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.93367: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.93371: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.93549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.93804: done with get_vars() 15794 1726882605.93813: done getting variables 15794 1726882605.93865: in VariableManager get_vars() 15794 1726882605.93874: Calling all_inventory to load vars for managed_node1 15794 1726882605.93877: Calling groups_inventory to load vars for managed_node1 15794 1726882605.93880: Calling all_plugins_inventory to load vars for managed_node1 15794 
1726882605.93885: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.93888: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.93891: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.94093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.94337: done with get_vars() 15794 1726882605.94346: done queuing things up, now waiting for results queue to drain 15794 1726882605.94347: results queue empty 15794 1726882605.94348: checking for any_errors_fatal 15794 1726882605.94349: done checking for any_errors_fatal 15794 1726882605.94349: checking for max_fail_percentage 15794 1726882605.94350: done checking for max_fail_percentage 15794 1726882605.94351: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.94351: done checking to see if all hosts have failed 15794 1726882605.94352: getting the remaining hosts for this loop 15794 1726882605.94352: done getting the remaining hosts for this loop 15794 1726882605.94354: getting the next task for host managed_node1 15794 1726882605.94356: done getting next task for host managed_node1 15794 1726882605.94357: ^ task is: None 15794 1726882605.94358: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.94359: done queuing things up, now waiting for results queue to drain 15794 1726882605.94360: results queue empty 15794 1726882605.94360: checking for any_errors_fatal 15794 1726882605.94361: done checking for any_errors_fatal 15794 1726882605.94361: checking for max_fail_percentage 15794 1726882605.94362: done checking for max_fail_percentage 15794 1726882605.94362: checking to see if all hosts have failed and the running result is not ok 15794 1726882605.94363: done checking to see if all hosts have failed 15794 1726882605.94364: getting the next task for host managed_node1 15794 1726882605.94366: done getting next task for host managed_node1 15794 1726882605.94366: ^ task is: None 15794 1726882605.94367: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.94399: in VariableManager get_vars() 15794 1726882605.94419: done with get_vars() 15794 1726882605.94428: in VariableManager get_vars() 15794 1726882605.94442: done with get_vars() 15794 1726882605.94448: variable 'omit' from source: magic vars 15794 1726882605.94472: in VariableManager get_vars() 15794 1726882605.94479: done with get_vars() 15794 1726882605.94495: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 15794 1726882605.94636: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882605.94658: getting the remaining hosts for this loop 15794 1726882605.94660: done getting the remaining hosts for this loop 15794 1726882605.94662: getting the next task for host managed_node1 15794 1726882605.94664: done getting next task for host managed_node1 15794 1726882605.94665: ^ task is: TASK: Gathering Facts 15794 1726882605.94666: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882605.94668: getting variables 15794 1726882605.94669: in VariableManager get_vars() 15794 1726882605.94674: Calling all_inventory to load vars for managed_node1 15794 1726882605.94676: Calling groups_inventory to load vars for managed_node1 15794 1726882605.94678: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882605.94682: Calling all_plugins_play to load vars for managed_node1 15794 1726882605.94693: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882605.94695: Calling groups_plugins_play to load vars for managed_node1 15794 1726882605.94802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882605.94946: done with get_vars() 15794 1726882605.94952: done getting variables 15794 1726882605.94983: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 21:36:45 -0400 (0:00:00.048) 0:00:03.507 ****** 15794 1726882605.95000: entering _queue_task() for managed_node1/gather_facts 15794 1726882605.95151: worker is 1 (out of 1 available) 15794 1726882605.95163: exiting _queue_task() for managed_node1/gather_facts 15794 1726882605.95173: done queuing things up, now waiting for results queue to drain 15794 1726882605.95175: waiting for pending results... 
15794 1726882605.95321: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882605.95395: in run() - task 0affe814-3a2d-94e5-e48f-0000000000d8 15794 1726882605.95407: variable 'ansible_search_path' from source: unknown 15794 1726882605.95444: calling self._execute() 15794 1726882605.95498: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.95506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.95515: variable 'omit' from source: magic vars 15794 1726882605.95796: variable 'ansible_distribution_major_version' from source: facts 15794 1726882605.95807: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882605.95812: variable 'omit' from source: magic vars 15794 1726882605.95836: variable 'omit' from source: magic vars 15794 1726882605.95868: variable 'omit' from source: magic vars 15794 1726882605.95899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882605.95927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882605.95947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882605.95963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882605.95974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882605.96002: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882605.96005: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.96010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.96088: Set connection var ansible_connection to ssh 15794 1726882605.96098: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882605.96109: Set connection var ansible_pipelining to False 15794 1726882605.96116: Set connection var ansible_shell_executable to /bin/sh 15794 1726882605.96119: Set connection var ansible_shell_type to sh 15794 1726882605.96128: Set connection var ansible_timeout to 10 15794 1726882605.96153: variable 'ansible_shell_executable' from source: unknown 15794 1726882605.96157: variable 'ansible_connection' from source: unknown 15794 1726882605.96160: variable 'ansible_module_compression' from source: unknown 15794 1726882605.96163: variable 'ansible_shell_type' from source: unknown 15794 1726882605.96166: variable 'ansible_shell_executable' from source: unknown 15794 1726882605.96171: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882605.96177: variable 'ansible_pipelining' from source: unknown 15794 1726882605.96184: variable 'ansible_timeout' from source: unknown 15794 1726882605.96189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882605.96528: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882605.96532: variable 'omit' from source: magic vars 15794 1726882605.96537: starting attempt loop 15794 1726882605.96540: running the handler 15794 1726882605.96542: variable 'ansible_facts' from source: unknown 15794 1726882605.96544: _low_level_execute_command(): starting 15794 1726882605.96547: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882605.97254: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882605.97258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882605.97260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882605.97308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882605.97346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882605.99090: stdout chunk (state=3): >>>/root <<< 15794 1726882605.99197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882605.99244: stderr chunk (state=3): >>><<< 15794 1726882605.99248: stdout chunk (state=3): >>><<< 15794 1726882605.99266: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882605.99281: _low_level_execute_command(): starting 15794 1726882605.99285: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368 `" && echo ansible-tmp-1726882605.992672-15918-112316066205368="` echo /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368 `" ) && sleep 0' 15794 1726882605.99708: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882605.99712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882605.99714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882605.99723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882605.99777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882605.99782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882605.99842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882606.01813: stdout chunk (state=3): >>>ansible-tmp-1726882605.992672-15918-112316066205368=/root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368 <<< 15794 1726882606.01937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882606.01982: stderr chunk (state=3): >>><<< 15794 1726882606.01986: stdout chunk (state=3): >>><<< 15794 1726882606.02001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882605.992672-15918-112316066205368=/root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882606.02025: variable 'ansible_module_compression' from source: unknown 15794 1726882606.02069: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882606.02123: variable 'ansible_facts' from source: unknown 15794 1726882606.02244: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/AnsiballZ_setup.py 15794 1726882606.02458: Sending initial data 15794 1726882606.02462: Sent initial data (153 bytes) 15794 1726882606.02829: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882606.02833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882606.02837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882606.02840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882606.02968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882606.03171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882606.04833: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882606.04897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882606.05158: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpdxatj82m /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/AnsiballZ_setup.py <<< 15794 1726882606.05162: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/AnsiballZ_setup.py" <<< 15794 1726882606.05228: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpdxatj82m" to remote "/root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/AnsiballZ_setup.py" <<< 15794 1726882606.07866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882606.07941: stderr chunk (state=3): >>><<< 15794 1726882606.07944: stdout chunk (state=3): >>><<< 15794 1726882606.07961: done transferring module to remote 15794 1726882606.07971: _low_level_execute_command(): starting 15794 1726882606.07977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/ /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/AnsiballZ_setup.py && sleep 0' 15794 1726882606.08429: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882606.08432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882606.08435: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882606.08444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882606.08447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882606.08498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882606.08501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882606.08562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882606.10542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882606.10545: stdout chunk (state=3): >>><<< 15794 1726882606.10548: stderr chunk (state=3): >>><<< 15794 1726882606.10565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882606.10654: _low_level_execute_command(): starting 15794 1726882606.10658: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/AnsiballZ_setup.py && sleep 0' 15794 1726882606.11197: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882606.11213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882606.11251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882606.11262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882606.11305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882606.11383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882606.11414: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 15794 1726882606.11442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882606.11496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882606.78740: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2871, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 846, "free": 2871}, "nocache": {"free": 3475, "used": 242}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", 
"ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_ve<<< 15794 1726882606.78789: stdout chunk (state=3): >>>rsion": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 560, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205451776, "block_size": 4096, "block_total": 64483404, "block_available": 61329456, "block_used": 3153948, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6<<< 15794 1726882606.78809: stdout chunk (state=3): >>>_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.4375, "5m": 0.4248046875, "15m": 0.20849609375}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": 
"UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "46", "epoch": "1726882606", "epoch_int": "1726882606", "date": "2024-09-20", "time": "21:36:46", "iso8601_micro": "2024-09-21T01:36:46.783492Z", "iso8601": "2024-09-21T01:36:46Z", "iso8601_basic": "20240920T213646783492", "iso8601_basic_short": "20240920T213646", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882606.80909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882606.80913: stdout chunk (state=3): >>><<< 15794 1726882606.80916: stderr chunk (state=3): >>><<< 15794 1726882606.81208: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2871, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 846, "free": 2871}, "nocache": {"free": 3475, "used": 242}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", 
"ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 560, "ansible_lvm": {"lvs": 
{}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205451776, "block_size": 4096, "block_total": 64483404, "block_available": 61329456, "block_used": 3153948, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", 
"scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.4375, "5m": 0.4248046875, "15m": 0.20849609375}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "46", "epoch": "1726882606", "epoch_int": "1726882606", "date": "2024-09-20", "time": "21:36:46", "iso8601_micro": "2024-09-21T01:36:46.783492Z", "iso8601": "2024-09-21T01:36:46Z", "iso8601_basic": "20240920T213646783492", "iso8601_basic_short": "20240920T213646", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882606.81606: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882606.81664: _low_level_execute_command(): starting 15794 1726882606.81839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882605.992672-15918-112316066205368/ > /dev/null 2>&1 && sleep 0' 15794 1726882606.82857: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882606.82869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882606.82897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882606.82976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882606.85032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882606.85053: stdout chunk (state=3): >>><<< 15794 1726882606.85068: stderr chunk (state=3): >>><<< 15794 1726882606.85156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882606.85439: handler run complete 15794 1726882606.85682: variable 'ansible_facts' from source: unknown 15794 1726882606.85895: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.86884: variable 'ansible_facts' from source: unknown 15794 1726882606.87161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.87582: attempt loop complete, returning result 15794 1726882606.87594: _execute() done 15794 1726882606.87771: dumping result to json 15794 1726882606.87774: done dumping result, returning 15794 1726882606.87777: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-0000000000d8] 15794 1726882606.87779: sending task result for task 0affe814-3a2d-94e5-e48f-0000000000d8 15794 1726882606.88425: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000000d8 15794 1726882606.88428: WORKER PROCESS EXITING ok: [managed_node1] 15794 1726882606.89164: no more pending results, returning what we have 15794 1726882606.89168: results queue empty 15794 1726882606.89169: checking for any_errors_fatal 15794 1726882606.89171: done checking for any_errors_fatal 15794 1726882606.89176: checking for max_fail_percentage 15794 1726882606.89178: done checking for max_fail_percentage 15794 1726882606.89179: checking to see if all hosts have failed and the running result is not ok 15794 1726882606.89180: done checking to see if all hosts have failed 15794 1726882606.89181: getting the remaining hosts for this loop 15794 1726882606.89183: done getting the remaining hosts for this loop 15794 1726882606.89187: getting the next task for host managed_node1 15794 1726882606.89193: done getting next task for host managed_node1 15794 1726882606.89195: ^ task is: TASK: meta (flush_handlers) 15794 1726882606.89198: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15794 1726882606.89202: getting variables 15794 1726882606.89203: in VariableManager get_vars() 15794 1726882606.89228: Calling all_inventory to load vars for managed_node1 15794 1726882606.89231: Calling groups_inventory to load vars for managed_node1 15794 1726882606.89238: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882606.89250: Calling all_plugins_play to load vars for managed_node1 15794 1726882606.89254: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882606.89259: Calling groups_plugins_play to load vars for managed_node1 15794 1726882606.89478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.89770: done with get_vars() 15794 1726882606.89783: done getting variables 15794 1726882606.89870: in VariableManager get_vars() 15794 1726882606.89881: Calling all_inventory to load vars for managed_node1 15794 1726882606.89884: Calling groups_inventory to load vars for managed_node1 15794 1726882606.89888: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882606.89893: Calling all_plugins_play to load vars for managed_node1 15794 1726882606.89896: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882606.89900: Calling groups_plugins_play to load vars for managed_node1 15794 1726882606.90179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.90543: done with get_vars() 15794 1726882606.90560: done queuing things up, now waiting for results queue to drain 15794 1726882606.90562: results queue empty 15794 1726882606.90563: checking for any_errors_fatal 15794 1726882606.90568: done checking for any_errors_fatal 15794 1726882606.90569: checking for max_fail_percentage 15794 1726882606.90571: done checking for max_fail_percentage 15794 
1726882606.90572: checking to see if all hosts have failed and the running result is not ok 15794 1726882606.90573: done checking to see if all hosts have failed 15794 1726882606.90574: getting the remaining hosts for this loop 15794 1726882606.90580: done getting the remaining hosts for this loop 15794 1726882606.90583: getting the next task for host managed_node1 15794 1726882606.90593: done getting next task for host managed_node1 15794 1726882606.90596: ^ task is: TASK: Show inside ethernet tests 15794 1726882606.90598: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882606.90600: getting variables 15794 1726882606.90601: in VariableManager get_vars() 15794 1726882606.90612: Calling all_inventory to load vars for managed_node1 15794 1726882606.90614: Calling groups_inventory to load vars for managed_node1 15794 1726882606.90617: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882606.90623: Calling all_plugins_play to load vars for managed_node1 15794 1726882606.90626: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882606.90630: Calling groups_plugins_play to load vars for managed_node1 15794 1726882606.90843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.91121: done with get_vars() 15794 1726882606.91131: done getting variables 15794 1726882606.91227: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] 
********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 21:36:46 -0400 (0:00:00.962) 0:00:04.470 ****** 15794 1726882606.91266: entering _queue_task() for managed_node1/debug 15794 1726882606.91268: Creating lock for debug 15794 1726882606.91668: worker is 1 (out of 1 available) 15794 1726882606.91794: exiting _queue_task() for managed_node1/debug 15794 1726882606.91806: done queuing things up, now waiting for results queue to drain 15794 1726882606.91807: waiting for pending results... 15794 1726882606.91937: running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests 15794 1726882606.92042: in run() - task 0affe814-3a2d-94e5-e48f-00000000000b 15794 1726882606.92125: variable 'ansible_search_path' from source: unknown 15794 1726882606.92129: calling self._execute() 15794 1726882606.92191: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882606.92206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882606.92221: variable 'omit' from source: magic vars 15794 1726882606.92974: variable 'ansible_distribution_major_version' from source: facts 15794 1726882606.92978: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882606.92980: variable 'omit' from source: magic vars 15794 1726882606.92983: variable 'omit' from source: magic vars 15794 1726882606.92989: variable 'omit' from source: magic vars 15794 1726882606.93042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882606.93098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882606.93129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882606.93166: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882606.93185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882606.93227: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882606.93240: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882606.93251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882606.93389: Set connection var ansible_connection to ssh 15794 1726882606.93408: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882606.93421: Set connection var ansible_pipelining to False 15794 1726882606.93433: Set connection var ansible_shell_executable to /bin/sh 15794 1726882606.93444: Set connection var ansible_shell_type to sh 15794 1726882606.93458: Set connection var ansible_timeout to 10 15794 1726882606.93501: variable 'ansible_shell_executable' from source: unknown 15794 1726882606.93518: variable 'ansible_connection' from source: unknown 15794 1726882606.93522: variable 'ansible_module_compression' from source: unknown 15794 1726882606.93595: variable 'ansible_shell_type' from source: unknown 15794 1726882606.93598: variable 'ansible_shell_executable' from source: unknown 15794 1726882606.93600: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882606.93602: variable 'ansible_pipelining' from source: unknown 15794 1726882606.93604: variable 'ansible_timeout' from source: unknown 15794 1726882606.93606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882606.93750: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882606.93769: variable 'omit' from source: magic vars 15794 1726882606.93780: starting attempt loop 15794 1726882606.93787: running the handler 15794 1726882606.93850: handler run complete 15794 1726882606.93959: attempt loop complete, returning result 15794 1726882606.93966: _execute() done 15794 1726882606.93974: dumping result to json 15794 1726882606.93982: done dumping result, returning 15794 1726882606.94030: done running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests [0affe814-3a2d-94e5-e48f-00000000000b] 15794 1726882606.94034: sending task result for task 0affe814-3a2d-94e5-e48f-00000000000b ok: [managed_node1] => {} MSG: Inside ethernet tests 15794 1726882606.94220: no more pending results, returning what we have 15794 1726882606.94224: results queue empty 15794 1726882606.94226: checking for any_errors_fatal 15794 1726882606.94227: done checking for any_errors_fatal 15794 1726882606.94228: checking for max_fail_percentage 15794 1726882606.94231: done checking for max_fail_percentage 15794 1726882606.94231: checking to see if all hosts have failed and the running result is not ok 15794 1726882606.94232: done checking to see if all hosts have failed 15794 1726882606.94235: getting the remaining hosts for this loop 15794 1726882606.94238: done getting the remaining hosts for this loop 15794 1726882606.94243: getting the next task for host managed_node1 15794 1726882606.94250: done getting next task for host managed_node1 15794 1726882606.94253: ^ task is: TASK: Show network_provider 15794 1726882606.94256: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15794 1726882606.94260: getting variables 15794 1726882606.94261: in VariableManager get_vars() 15794 1726882606.94298: Calling all_inventory to load vars for managed_node1 15794 1726882606.94301: Calling groups_inventory to load vars for managed_node1 15794 1726882606.94305: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882606.94317: Calling all_plugins_play to load vars for managed_node1 15794 1726882606.94321: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882606.94325: Calling groups_plugins_play to load vars for managed_node1 15794 1726882606.94934: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000000b 15794 1726882606.94938: WORKER PROCESS EXITING 15794 1726882606.94966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.95261: done with get_vars() 15794 1726882606.95273: done getting variables 15794 1726882606.95350: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 21:36:46 -0400 (0:00:00.041) 0:00:04.511 ****** 15794 1726882606.95384: entering _queue_task() for managed_node1/debug 15794 1726882606.95785: worker is 1 (out of 1 available) 15794 1726882606.95798: exiting _queue_task() for managed_node1/debug 15794 1726882606.95808: done queuing things up, now waiting for results queue to drain 15794 1726882606.95810: waiting for pending results... 
15794 1726882606.95998: running TaskExecutor() for managed_node1/TASK: Show network_provider 15794 1726882606.96107: in run() - task 0affe814-3a2d-94e5-e48f-00000000000c 15794 1726882606.96127: variable 'ansible_search_path' from source: unknown 15794 1726882606.96180: calling self._execute() 15794 1726882606.96274: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882606.96295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882606.96313: variable 'omit' from source: magic vars 15794 1726882606.96777: variable 'ansible_distribution_major_version' from source: facts 15794 1726882606.96804: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882606.96816: variable 'omit' from source: magic vars 15794 1726882606.96861: variable 'omit' from source: magic vars 15794 1726882606.96940: variable 'omit' from source: magic vars 15794 1726882606.96969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882606.97021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882606.97120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882606.97124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882606.97126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882606.97140: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882606.97149: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882606.97162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882606.97299: Set connection var ansible_connection to ssh 15794 1726882606.97314: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882606.97325: Set connection var ansible_pipelining to False 15794 1726882606.97339: Set connection var ansible_shell_executable to /bin/sh 15794 1726882606.97351: Set connection var ansible_shell_type to sh 15794 1726882606.97366: Set connection var ansible_timeout to 10 15794 1726882606.97411: variable 'ansible_shell_executable' from source: unknown 15794 1726882606.97421: variable 'ansible_connection' from source: unknown 15794 1726882606.97458: variable 'ansible_module_compression' from source: unknown 15794 1726882606.97462: variable 'ansible_shell_type' from source: unknown 15794 1726882606.97464: variable 'ansible_shell_executable' from source: unknown 15794 1726882606.97467: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882606.97469: variable 'ansible_pipelining' from source: unknown 15794 1726882606.97471: variable 'ansible_timeout' from source: unknown 15794 1726882606.97473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882606.97655: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882606.97704: variable 'omit' from source: magic vars 15794 1726882606.97708: starting attempt loop 15794 1726882606.97711: running the handler 15794 1726882606.97758: variable 'network_provider' from source: set_fact 15794 1726882606.97866: variable 'network_provider' from source: set_fact 15794 1726882606.97921: handler run complete 15794 1726882606.97937: attempt loop complete, returning result 15794 1726882606.97946: _execute() done 15794 1726882606.98030: dumping result to json 15794 1726882606.98035: done dumping result, returning 15794 1726882606.98040: done running 
TaskExecutor() for managed_node1/TASK: Show network_provider [0affe814-3a2d-94e5-e48f-00000000000c] 15794 1726882606.98042: sending task result for task 0affe814-3a2d-94e5-e48f-00000000000c 15794 1726882606.98115: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000000c 15794 1726882606.98118: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 15794 1726882606.98176: no more pending results, returning what we have 15794 1726882606.98183: results queue empty 15794 1726882606.98184: checking for any_errors_fatal 15794 1726882606.98193: done checking for any_errors_fatal 15794 1726882606.98194: checking for max_fail_percentage 15794 1726882606.98197: done checking for max_fail_percentage 15794 1726882606.98198: checking to see if all hosts have failed and the running result is not ok 15794 1726882606.98199: done checking to see if all hosts have failed 15794 1726882606.98200: getting the remaining hosts for this loop 15794 1726882606.98202: done getting the remaining hosts for this loop 15794 1726882606.98207: getting the next task for host managed_node1 15794 1726882606.98215: done getting next task for host managed_node1 15794 1726882606.98217: ^ task is: TASK: meta (flush_handlers) 15794 1726882606.98220: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882606.98225: getting variables 15794 1726882606.98227: in VariableManager get_vars() 15794 1726882606.98260: Calling all_inventory to load vars for managed_node1 15794 1726882606.98263: Calling groups_inventory to load vars for managed_node1 15794 1726882606.98268: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882606.98281: Calling all_plugins_play to load vars for managed_node1 15794 1726882606.98285: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882606.98289: Calling groups_plugins_play to load vars for managed_node1 15794 1726882606.98788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.99085: done with get_vars() 15794 1726882606.99102: done getting variables 15794 1726882606.99181: in VariableManager get_vars() 15794 1726882606.99192: Calling all_inventory to load vars for managed_node1 15794 1726882606.99200: Calling groups_inventory to load vars for managed_node1 15794 1726882606.99203: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882606.99209: Calling all_plugins_play to load vars for managed_node1 15794 1726882606.99213: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882606.99217: Calling groups_plugins_play to load vars for managed_node1 15794 1726882606.99452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882606.99748: done with get_vars() 15794 1726882606.99762: done queuing things up, now waiting for results queue to drain 15794 1726882606.99764: results queue empty 15794 1726882606.99765: checking for any_errors_fatal 15794 1726882606.99769: done checking for any_errors_fatal 15794 1726882606.99770: checking for max_fail_percentage 15794 1726882606.99771: done checking for max_fail_percentage 15794 1726882606.99772: checking to see if all hosts have failed and the running result is not 
ok
15794 1726882606.99773: done checking to see if all hosts have failed
15794 1726882606.99774: getting the remaining hosts for this loop
15794 1726882606.99775: done getting the remaining hosts for this loop
15794 1726882606.99780: getting the next task for host managed_node1
15794 1726882606.99790: done getting next task for host managed_node1
15794 1726882606.99792: ^ task is: TASK: meta (flush_handlers)
15794 1726882606.99794: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882606.99797: getting variables
15794 1726882606.99798: in VariableManager get_vars()
15794 1726882606.99808: Calling all_inventory to load vars for managed_node1
15794 1726882606.99811: Calling groups_inventory to load vars for managed_node1
15794 1726882606.99814: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882606.99819: Calling all_plugins_play to load vars for managed_node1
15794 1726882606.99822: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882606.99826: Calling groups_plugins_play to load vars for managed_node1
15794 1726882607.00029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882607.00320: done with get_vars()
15794 1726882607.00330: done getting variables
15794 1726882607.00393: in VariableManager get_vars()
15794 1726882607.00403: Calling all_inventory to load vars for managed_node1
15794 1726882607.00406: Calling groups_inventory to load vars for managed_node1
15794 1726882607.00409: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882607.00414: Calling all_plugins_play to load vars for managed_node1
15794 1726882607.00417: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882607.00420: Calling groups_plugins_play to load vars for managed_node1
15794 1726882607.00651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882607.00936: done with get_vars()
15794 1726882607.00952: done queuing things up, now waiting for results queue to drain
15794 1726882607.00954: results queue empty
15794 1726882607.00955: checking for any_errors_fatal
15794 1726882607.00956: done checking for any_errors_fatal
15794 1726882607.00957: checking for max_fail_percentage
15794 1726882607.00958: done checking for max_fail_percentage
15794 1726882607.00959: checking to see if all hosts have failed and the running result is not ok
15794 1726882607.00960: done checking to see if all hosts have failed
15794 1726882607.00961: getting the remaining hosts for this loop
15794 1726882607.00962: done getting the remaining hosts for this loop
15794 1726882607.00965: getting the next task for host managed_node1
15794 1726882607.00968: done getting next task for host managed_node1
15794 1726882607.00969: ^ task is: None
15794 1726882607.00970: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882607.00972: done queuing things up, now waiting for results queue to drain
15794 1726882607.00973: results queue empty
15794 1726882607.00974: checking for any_errors_fatal
15794 1726882607.00974: done checking for any_errors_fatal
15794 1726882607.00975: checking for max_fail_percentage
15794 1726882607.00977: done checking for max_fail_percentage
15794 1726882607.00977: checking to see if all hosts have failed and the running result is not ok
15794 1726882607.00981: done checking to see if all hosts have failed
15794 1726882607.00983: getting the next task for host managed_node1
15794 1726882607.00986: done getting next task for host managed_node1
15794 1726882607.00987: ^ task is: None
15794 1726882607.00989: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882607.01029: in VariableManager get_vars()
15794 1726882607.01052: done with get_vars()
15794 1726882607.01059: in VariableManager get_vars()
15794 1726882607.01070: done with get_vars()
15794 1726882607.01075: variable 'omit' from source: magic vars
15794 1726882607.01112: in VariableManager get_vars()
15794 1726882607.01124: done with get_vars()
15794 1726882607.01157: variable 'omit' from source: magic vars

PLAY [Test configuring ethernet devices] ***************************************
15794 1726882607.01365: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
15794 1726882607.01461: getting the remaining hosts for this loop
15794 1726882607.01463: done getting the remaining hosts for this loop
15794 1726882607.01466: getting the next task for host managed_node1
15794 1726882607.01469: done getting next task for host managed_node1
15794 1726882607.01471: ^ task is: TASK: Gathering Facts
15794 1726882607.01473: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882607.01476: getting variables
15794 1726882607.01477: in VariableManager get_vars()
15794 1726882607.01486: Calling all_inventory to load vars for managed_node1
15794 1726882607.01489: Calling groups_inventory to load vars for managed_node1
15794 1726882607.01492: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882607.01498: Calling all_plugins_play to load vars for managed_node1
15794 1726882607.01501: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882607.01504: Calling groups_plugins_play to load vars for managed_node1
15794 1726882607.01696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882607.01957: done with get_vars()
15794 1726882607.01972: done getting variables
15794 1726882607.02020: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
Friday 20 September 2024 21:36:47 -0400 (0:00:00.066) 0:00:04.578 ******
15794 1726882607.02050: entering _queue_task() for managed_node1/gather_facts
15794 1726882607.02367: worker is 1 (out of 1 available)
15794 1726882607.02380: exiting _queue_task() for managed_node1/gather_facts
15794 1726882607.02396: done queuing things up, now waiting for results queue to drain
15794 1726882607.02398: waiting for pending results...
15794 1726882607.02745: running TaskExecutor() for managed_node1/TASK: Gathering Facts
15794 1726882607.02755: in run() - task 0affe814-3a2d-94e5-e48f-0000000000f0
15794 1726882607.02779: variable 'ansible_search_path' from source: unknown
15794 1726882607.02826: calling self._execute()
15794 1726882607.02923: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882607.02939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882607.02961: variable 'omit' from source: magic vars
15794 1726882607.03397: variable 'ansible_distribution_major_version' from source: facts
15794 1726882607.03416: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882607.03428: variable 'omit' from source: magic vars
15794 1726882607.03496: variable 'omit' from source: magic vars
15794 1726882607.03519: variable 'omit' from source: magic vars
15794 1726882607.03569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15794 1726882607.03619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15794 1726882607.03649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15794 1726882607.03714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882607.03717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882607.03737: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15794 1726882607.03748: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882607.03757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882607.03885: Set connection var ansible_connection to ssh
15794 1726882607.03902: Set connection var ansible_module_compression to ZIP_DEFLATED
15794 1726882607.03916: Set connection var ansible_pipelining to False
15794 1726882607.04040: Set connection var ansible_shell_executable to /bin/sh
15794 1726882607.04043: Set connection var ansible_shell_type to sh
15794 1726882607.04046: Set connection var ansible_timeout to 10
15794 1726882607.04048: variable 'ansible_shell_executable' from source: unknown
15794 1726882607.04050: variable 'ansible_connection' from source: unknown
15794 1726882607.04052: variable 'ansible_module_compression' from source: unknown
15794 1726882607.04055: variable 'ansible_shell_type' from source: unknown
15794 1726882607.04057: variable 'ansible_shell_executable' from source: unknown
15794 1726882607.04059: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882607.04061: variable 'ansible_pipelining' from source: unknown
15794 1726882607.04063: variable 'ansible_timeout' from source: unknown
15794 1726882607.04065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882607.04258: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15794 1726882607.04274: variable 'omit' from source: magic vars
15794 1726882607.04290: starting attempt loop
15794 1726882607.04298: running the handler
15794 1726882607.04322: variable 'ansible_facts' from source: unknown
15794 1726882607.04349: _low_level_execute_command(): starting
15794 1726882607.04363: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15794 1726882607.05053: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config
<<< 15794 1726882607.05073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882607.05096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.05144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882607.05148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882607.05222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882607.07148: stdout chunk (state=3): >>>/root <<< 15794 1726882607.07153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882607.07484: stderr chunk (state=3): >>><<< 15794 1726882607.07488: stdout chunk (state=3): >>><<< 15794 1726882607.07492: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882607.07495: _low_level_execute_command(): starting 15794 1726882607.07497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541 `" && echo ansible-tmp-1726882607.0741663-15963-34224389918541="` echo /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541 `" ) && sleep 0' 15794 1726882607.08594: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882607.08610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882607.08650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882607.08679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 
1726882607.08713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882607.08792: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.08821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882607.08849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882607.08870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882607.08964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882607.11011: stdout chunk (state=3): >>>ansible-tmp-1726882607.0741663-15963-34224389918541=/root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541 <<< 15794 1726882607.11199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882607.11220: stderr chunk (state=3): >>><<< 15794 1726882607.11230: stdout chunk (state=3): >>><<< 15794 1726882607.11261: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882607.0741663-15963-34224389918541=/root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882607.11455: variable 'ansible_module_compression' from source: unknown 15794 1726882607.11458: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882607.11460: variable 'ansible_facts' from source: unknown 15794 1726882607.11625: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/AnsiballZ_setup.py 15794 1726882607.12016: Sending initial data 15794 1726882607.12032: Sent initial data (153 bytes) 15794 1726882607.12411: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882607.12425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882607.12441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.12491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882607.12505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882607.12569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882607.14169: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882607.14247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882607.14312: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpenhq13nr /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/AnsiballZ_setup.py <<< 15794 1726882607.14316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/AnsiballZ_setup.py" <<< 15794 1726882607.14374: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpenhq13nr" to remote "/root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/AnsiballZ_setup.py" <<< 15794 1726882607.16769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882607.16798: stderr chunk (state=3): >>><<< 15794 1726882607.16807: stdout chunk (state=3): >>><<< 15794 1726882607.16839: done transferring module to remote 15794 1726882607.16875: _low_level_execute_command(): starting 15794 1726882607.16878: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/ /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/AnsiballZ_setup.py && sleep 0' 15794 1726882607.17414: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882607.17420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882607.17423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 
1726882607.17426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882607.17428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.17492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882607.17495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882607.17548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882607.19523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882607.19526: stdout chunk (state=3): >>><<< 15794 1726882607.19529: stderr chunk (state=3): >>><<< 15794 1726882607.19548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882607.19557: _low_level_execute_command(): starting 15794 1726882607.19568: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/AnsiballZ_setup.py && sleep 0' 15794 1726882607.20157: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882607.20160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.20163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882607.20165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.20217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882607.20224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 15794 1726882607.20286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882607.88452: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2847, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 870, "free": 2847}, "nocache": {"free": 3451, "used": 266}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid":<<< 15794 1726882607.88461: stdout chunk (state=3): >>> "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": 
"rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205443584, "block_size": 4096, "block_total": 64483404, "block_available": 61329454, "block_used": 3153950, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.4375, "5m": 0.4248046875, "15m": 0.20849609375}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", 
"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number"<<< 15794 1726882607.88475: stdout chunk (state=3): >>>: "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "47", "epoch": "1726882607", "epoch_int": "1726882607", "date": "2024-09-20", "time": "21:36:47", "iso8601_micro": "2024-09-21T01:36:47.854693Z", "iso8601": "2024-09-21T01:36:47Z", "iso8601_basic": "20240920T213647854693", "iso8601_basic_short": "20240920T213647", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882607.90630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882607.90695: stderr chunk (state=3): >>><<< 15794 1726882607.90700: stdout chunk (state=3): >>><<< 15794 1726882607.90724: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2847, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 870, "free": 2847}, "nocache": {"free": 3451, "used": 266}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 
MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205443584, "block_size": 4096, "block_total": 64483404, "block_available": 61329454, "block_used": 3153950, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", 
"SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.4375, "5m": 0.4248046875, "15m": 0.20849609375}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "47", "epoch": "1726882607", "epoch_int": "1726882607", "date": "2024-09-20", "time": "21:36:47", "iso8601_micro": "2024-09-21T01:36:47.854693Z", "iso8601": "2024-09-21T01:36:47Z", "iso8601_basic": "20240920T213647854693", "iso8601_basic_short": "20240920T213647", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": 
"eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882607.90961: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882607.90985: _low_level_execute_command(): starting 15794 1726882607.90990: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882607.0741663-15963-34224389918541/ > /dev/null 2>&1 && sleep 0' 15794 1726882607.91481: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882607.91485: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882607.91487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.91490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882607.91492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882607.91541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882607.91544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882607.91610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882607.93611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882607.93639: stdout chunk (state=3): >>><<< 15794 1726882607.93642: stderr chunk (state=3): >>><<< 15794 1726882607.93652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882607.93663: handler run complete 15794 1726882607.94041: variable 'ansible_facts' from source: unknown 15794 1726882607.94044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882607.94538: variable 'ansible_facts' from source: unknown 15794 1726882607.94677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882607.95439: attempt loop complete, returning result 15794 1726882607.95443: _execute() done 15794 1726882607.95445: dumping result to json 15794 1726882607.95447: done dumping result, returning 15794 1726882607.95450: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-0000000000f0] 15794 1726882607.95452: sending task result for task 0affe814-3a2d-94e5-e48f-0000000000f0 15794 1726882607.95813: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000000f0 15794 1726882607.95816: WORKER PROCESS EXITING ok: [managed_node1] 15794 1726882607.96564: no more pending results, returning what we have 15794 1726882607.96685: results queue empty 15794 1726882607.96686: checking for any_errors_fatal 15794 
1726882607.96688: done checking for any_errors_fatal 15794 1726882607.96689: checking for max_fail_percentage 15794 1726882607.96691: done checking for max_fail_percentage 15794 1726882607.96692: checking to see if all hosts have failed and the running result is not ok 15794 1726882607.96692: done checking to see if all hosts have failed 15794 1726882607.96693: getting the remaining hosts for this loop 15794 1726882607.96695: done getting the remaining hosts for this loop 15794 1726882607.96699: getting the next task for host managed_node1 15794 1726882607.96704: done getting next task for host managed_node1 15794 1726882607.96706: ^ task is: TASK: meta (flush_handlers) 15794 1726882607.96708: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882607.96712: getting variables 15794 1726882607.96714: in VariableManager get_vars() 15794 1726882607.96904: Calling all_inventory to load vars for managed_node1 15794 1726882607.96908: Calling groups_inventory to load vars for managed_node1 15794 1726882607.96912: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882607.96928: Calling all_plugins_play to load vars for managed_node1 15794 1726882607.96931: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882607.96937: Calling groups_plugins_play to load vars for managed_node1 15794 1726882607.97368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882607.97953: done with get_vars() 15794 1726882607.97968: done getting variables 15794 1726882607.98175: in VariableManager get_vars() 15794 1726882607.98192: Calling all_inventory to load vars for managed_node1 15794 1726882607.98195: Calling groups_inventory to 
load vars for managed_node1 15794 1726882607.98198: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882607.98205: Calling all_plugins_play to load vars for managed_node1 15794 1726882607.98323: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882607.98329: Calling groups_plugins_play to load vars for managed_node1 15794 1726882607.98756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882607.99290: done with get_vars() 15794 1726882607.99306: done queuing things up, now waiting for results queue to drain 15794 1726882607.99308: results queue empty 15794 1726882607.99310: checking for any_errors_fatal 15794 1726882607.99315: done checking for any_errors_fatal 15794 1726882607.99316: checking for max_fail_percentage 15794 1726882607.99317: done checking for max_fail_percentage 15794 1726882607.99437: checking to see if all hosts have failed and the running result is not ok 15794 1726882607.99440: done checking to see if all hosts have failed 15794 1726882607.99441: getting the remaining hosts for this loop 15794 1726882607.99442: done getting the remaining hosts for this loop 15794 1726882607.99446: getting the next task for host managed_node1 15794 1726882607.99451: done getting next task for host managed_node1 15794 1726882607.99454: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 15794 1726882607.99455: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882607.99458: getting variables 15794 1726882607.99459: in VariableManager get_vars() 15794 1726882607.99470: Calling all_inventory to load vars for managed_node1 15794 1726882607.99473: Calling groups_inventory to load vars for managed_node1 15794 1726882607.99476: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882607.99484: Calling all_plugins_play to load vars for managed_node1 15794 1726882607.99487: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882607.99491: Calling groups_plugins_play to load vars for managed_node1 15794 1726882607.99882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.00399: done with get_vars() 15794 1726882608.00410: done getting variables 15794 1726882608.00463: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882608.00666: variable 'type' from source: play vars 15794 1726882608.00672: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 21:36:48 -0400 (0:00:00.986) 0:00:05.565 ****** 15794 1726882608.00716: entering _queue_task() for managed_node1/set_fact 15794 1726882608.01039: worker is 1 (out of 1 available) 15794 1726882608.01051: exiting _queue_task() for managed_node1/set_fact 15794 1726882608.01065: done queuing things up, now waiting for results queue to drain 15794 1726882608.01066: waiting for pending results... 
15794 1726882608.01409: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 15794 1726882608.01454: in run() - task 0affe814-3a2d-94e5-e48f-00000000000f 15794 1726882608.01505: variable 'ansible_search_path' from source: unknown 15794 1726882608.01526: calling self._execute() 15794 1726882608.01618: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.01632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.01650: variable 'omit' from source: magic vars 15794 1726882608.02161: variable 'ansible_distribution_major_version' from source: facts 15794 1726882608.02165: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882608.02168: variable 'omit' from source: magic vars 15794 1726882608.02171: variable 'omit' from source: magic vars 15794 1726882608.02184: variable 'type' from source: play vars 15794 1726882608.02283: variable 'type' from source: play vars 15794 1726882608.02297: variable 'interface' from source: play vars 15794 1726882608.02383: variable 'interface' from source: play vars 15794 1726882608.02407: variable 'omit' from source: magic vars 15794 1726882608.02457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882608.02542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882608.02571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882608.02604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.02622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.02664: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 
1726882608.02675: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.02705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.02822: Set connection var ansible_connection to ssh 15794 1726882608.02924: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882608.02927: Set connection var ansible_pipelining to False 15794 1726882608.02930: Set connection var ansible_shell_executable to /bin/sh 15794 1726882608.02932: Set connection var ansible_shell_type to sh 15794 1726882608.02942: Set connection var ansible_timeout to 10 15794 1726882608.02944: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.02947: variable 'ansible_connection' from source: unknown 15794 1726882608.02949: variable 'ansible_module_compression' from source: unknown 15794 1726882608.02951: variable 'ansible_shell_type' from source: unknown 15794 1726882608.02953: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.02955: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.02965: variable 'ansible_pipelining' from source: unknown 15794 1726882608.02974: variable 'ansible_timeout' from source: unknown 15794 1726882608.02986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.03222: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882608.03243: variable 'omit' from source: magic vars 15794 1726882608.03260: starting attempt loop 15794 1726882608.03269: running the handler 15794 1726882608.03292: handler run complete 15794 1726882608.03308: attempt loop complete, returning result 15794 1726882608.03315: _execute() done 15794 
1726882608.03322: dumping result to json 15794 1726882608.03338: done dumping result, returning 15794 1726882608.03345: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 [0affe814-3a2d-94e5-e48f-00000000000f] 15794 1726882608.03364: sending task result for task 0affe814-3a2d-94e5-e48f-00000000000f ok: [managed_node1] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 15794 1726882608.03641: no more pending results, returning what we have 15794 1726882608.03645: results queue empty 15794 1726882608.03646: checking for any_errors_fatal 15794 1726882608.03648: done checking for any_errors_fatal 15794 1726882608.03649: checking for max_fail_percentage 15794 1726882608.03652: done checking for max_fail_percentage 15794 1726882608.03653: checking to see if all hosts have failed and the running result is not ok 15794 1726882608.03653: done checking to see if all hosts have failed 15794 1726882608.03654: getting the remaining hosts for this loop 15794 1726882608.03657: done getting the remaining hosts for this loop 15794 1726882608.03661: getting the next task for host managed_node1 15794 1726882608.03668: done getting next task for host managed_node1 15794 1726882608.03672: ^ task is: TASK: Include the task 'show_interfaces.yml' 15794 1726882608.03674: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882608.03682: getting variables 15794 1726882608.03684: in VariableManager get_vars() 15794 1726882608.03717: Calling all_inventory to load vars for managed_node1 15794 1726882608.03720: Calling groups_inventory to load vars for managed_node1 15794 1726882608.03725: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.03742: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.03746: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.03751: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.04190: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000000f 15794 1726882608.04194: WORKER PROCESS EXITING 15794 1726882608.04219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.04504: done with get_vars() 15794 1726882608.04515: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 21:36:48 -0400 (0:00:00.039) 0:00:05.604 ****** 15794 1726882608.04620: entering _queue_task() for managed_node1/include_tasks 15794 1726882608.04883: worker is 1 (out of 1 available) 15794 1726882608.04896: exiting _queue_task() for managed_node1/include_tasks 15794 1726882608.05023: done queuing things up, now waiting for results queue to drain 15794 1726882608.05025: waiting for pending results... 
15794 1726882608.05183: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 15794 1726882608.05297: in run() - task 0affe814-3a2d-94e5-e48f-000000000010 15794 1726882608.05317: variable 'ansible_search_path' from source: unknown 15794 1726882608.05368: calling self._execute() 15794 1726882608.05462: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.05482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.05499: variable 'omit' from source: magic vars 15794 1726882608.05922: variable 'ansible_distribution_major_version' from source: facts 15794 1726882608.05943: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882608.05955: _execute() done 15794 1726882608.05966: dumping result to json 15794 1726882608.06146: done dumping result, returning 15794 1726882608.06360: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-94e5-e48f-000000000010] 15794 1726882608.06364: sending task result for task 0affe814-3a2d-94e5-e48f-000000000010 15794 1726882608.06439: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000010 15794 1726882608.06442: WORKER PROCESS EXITING 15794 1726882608.06495: no more pending results, returning what we have 15794 1726882608.06501: in VariableManager get_vars() 15794 1726882608.06537: Calling all_inventory to load vars for managed_node1 15794 1726882608.06541: Calling groups_inventory to load vars for managed_node1 15794 1726882608.06545: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.06557: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.06561: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.06564: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.07241: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.07718: done with get_vars() 15794 1726882608.07726: variable 'ansible_search_path' from source: unknown 15794 1726882608.07742: we have included files to process 15794 1726882608.07743: generating all_blocks data 15794 1726882608.07745: done generating all_blocks data 15794 1726882608.07746: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15794 1726882608.07747: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15794 1726882608.07750: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15794 1726882608.08138: in VariableManager get_vars() 15794 1726882608.08157: done with get_vars() 15794 1726882608.08494: done processing included file 15794 1726882608.08497: iterating over new_blocks loaded from include file 15794 1726882608.08498: in VariableManager get_vars() 15794 1726882608.08543: done with get_vars() 15794 1726882608.08545: filtering new block on tags 15794 1726882608.08566: done filtering new block on tags 15794 1726882608.08569: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 15794 1726882608.08575: extending task lists for all hosts with included blocks 15794 1726882608.08890: done extending task lists 15794 1726882608.08892: done processing included files 15794 1726882608.08893: results queue empty 15794 1726882608.08894: checking for any_errors_fatal 15794 1726882608.08897: done checking for any_errors_fatal 15794 1726882608.08898: checking for max_fail_percentage 15794 1726882608.08900: done checking for 
max_fail_percentage 15794 1726882608.08901: checking to see if all hosts have failed and the running result is not ok 15794 1726882608.08902: done checking to see if all hosts have failed 15794 1726882608.08903: getting the remaining hosts for this loop 15794 1726882608.08904: done getting the remaining hosts for this loop 15794 1726882608.08907: getting the next task for host managed_node1 15794 1726882608.08911: done getting next task for host managed_node1 15794 1726882608.08914: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15794 1726882608.08917: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882608.08920: getting variables 15794 1726882608.08921: in VariableManager get_vars() 15794 1726882608.08930: Calling all_inventory to load vars for managed_node1 15794 1726882608.08933: Calling groups_inventory to load vars for managed_node1 15794 1726882608.08938: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.08944: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.08947: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.08951: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.09352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.09831: done with get_vars() 15794 1726882608.10045: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:48 -0400 (0:00:00.055) 0:00:05.659 ****** 15794 1726882608.10131: entering _queue_task() for managed_node1/include_tasks 15794 1726882608.10624: worker is 1 (out of 1 available) 15794 1726882608.11317: exiting _queue_task() for managed_node1/include_tasks 15794 1726882608.11329: done queuing things up, now waiting for results queue to drain 15794 1726882608.11330: waiting for pending results... 
15794 1726882608.11455: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 15794 1726882608.11462: in run() - task 0affe814-3a2d-94e5-e48f-000000000104 15794 1726882608.11466: variable 'ansible_search_path' from source: unknown 15794 1726882608.11470: variable 'ansible_search_path' from source: unknown 15794 1726882608.11543: calling self._execute() 15794 1726882608.11741: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.11744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.11747: variable 'omit' from source: magic vars 15794 1726882608.12828: variable 'ansible_distribution_major_version' from source: facts 15794 1726882608.12832: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882608.12838: _execute() done 15794 1726882608.12841: dumping result to json 15794 1726882608.12843: done dumping result, returning 15794 1726882608.12845: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-94e5-e48f-000000000104] 15794 1726882608.12847: sending task result for task 0affe814-3a2d-94e5-e48f-000000000104 15794 1726882608.12917: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000104 15794 1726882608.12922: WORKER PROCESS EXITING 15794 1726882608.12959: no more pending results, returning what we have 15794 1726882608.12965: in VariableManager get_vars() 15794 1726882608.13011: Calling all_inventory to load vars for managed_node1 15794 1726882608.13015: Calling groups_inventory to load vars for managed_node1 15794 1726882608.13020: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.13039: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.13043: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.13048: Calling groups_plugins_play to load vars for managed_node1 15794 
1726882608.13709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.14689: done with get_vars() 15794 1726882608.14698: variable 'ansible_search_path' from source: unknown 15794 1726882608.14699: variable 'ansible_search_path' from source: unknown 15794 1726882608.14777: we have included files to process 15794 1726882608.14781: generating all_blocks data 15794 1726882608.14783: done generating all_blocks data 15794 1726882608.14785: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15794 1726882608.14786: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15794 1726882608.14789: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15794 1726882608.16045: done processing included file 15794 1726882608.16048: iterating over new_blocks loaded from include file 15794 1726882608.16050: in VariableManager get_vars() 15794 1726882608.16066: done with get_vars() 15794 1726882608.16068: filtering new block on tags 15794 1726882608.16095: done filtering new block on tags 15794 1726882608.16098: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 15794 1726882608.16104: extending task lists for all hosts with included blocks 15794 1726882608.16449: done extending task lists 15794 1726882608.16451: done processing included files 15794 1726882608.16452: results queue empty 15794 1726882608.16453: checking for any_errors_fatal 15794 1726882608.16457: done checking for any_errors_fatal 15794 1726882608.16458: checking for max_fail_percentage 15794 1726882608.16460: done 
checking for max_fail_percentage 15794 1726882608.16461: checking to see if all hosts have failed and the running result is not ok 15794 1726882608.16462: done checking to see if all hosts have failed 15794 1726882608.16463: getting the remaining hosts for this loop 15794 1726882608.16464: done getting the remaining hosts for this loop 15794 1726882608.16467: getting the next task for host managed_node1 15794 1726882608.16472: done getting next task for host managed_node1 15794 1726882608.16474: ^ task is: TASK: Gather current interface info 15794 1726882608.16481: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882608.16484: getting variables 15794 1726882608.16486: in VariableManager get_vars() 15794 1726882608.16498: Calling all_inventory to load vars for managed_node1 15794 1726882608.16501: Calling groups_inventory to load vars for managed_node1 15794 1726882608.16504: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.16511: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.16514: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.16518: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.16920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.18229: done with get_vars() 15794 1726882608.18243: done getting variables 15794 1726882608.18293: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:48 -0400 (0:00:00.081) 0:00:05.741 ****** 15794 1726882608.18326: entering _queue_task() for managed_node1/command 15794 1726882608.18755: worker is 1 (out of 1 available) 15794 1726882608.18768: exiting _queue_task() for managed_node1/command 15794 1726882608.18780: done queuing things up, now waiting for results queue to drain 15794 1726882608.18782: waiting for pending results... 
15794 1726882608.18966: running TaskExecutor() for managed_node1/TASK: Gather current interface info 15794 1726882608.19242: in run() - task 0affe814-3a2d-94e5-e48f-000000000115 15794 1726882608.19247: variable 'ansible_search_path' from source: unknown 15794 1726882608.19250: variable 'ansible_search_path' from source: unknown 15794 1726882608.19253: calling self._execute() 15794 1726882608.19289: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.19303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.19318: variable 'omit' from source: magic vars 15794 1726882608.19791: variable 'ansible_distribution_major_version' from source: facts 15794 1726882608.19823: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882608.19874: variable 'omit' from source: magic vars 15794 1726882608.19950: variable 'omit' from source: magic vars 15794 1726882608.19997: variable 'omit' from source: magic vars 15794 1726882608.20050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882608.20098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882608.20134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882608.20181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.20219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.20340: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882608.20343: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.20348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 
1726882608.20411: Set connection var ansible_connection to ssh 15794 1726882608.20429: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882608.20448: Set connection var ansible_pipelining to False 15794 1726882608.20466: Set connection var ansible_shell_executable to /bin/sh 15794 1726882608.20475: Set connection var ansible_shell_type to sh 15794 1726882608.20493: Set connection var ansible_timeout to 10 15794 1726882608.20922: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.20926: variable 'ansible_connection' from source: unknown 15794 1726882608.20929: variable 'ansible_module_compression' from source: unknown 15794 1726882608.20931: variable 'ansible_shell_type' from source: unknown 15794 1726882608.20936: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.20938: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.20940: variable 'ansible_pipelining' from source: unknown 15794 1726882608.20942: variable 'ansible_timeout' from source: unknown 15794 1726882608.20946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.21080: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882608.21101: variable 'omit' from source: magic vars 15794 1726882608.21247: starting attempt loop 15794 1726882608.21251: running the handler 15794 1726882608.21253: _low_level_execute_command(): starting 15794 1726882608.21255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882608.22165: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882608.22257: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15794 1726882608.22264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882608.22346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882608.22385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882608.22447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882608.24249: stdout chunk (state=3): >>>/root <<< 15794 1726882608.24561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882608.24565: stdout chunk (state=3): >>><<< 15794 1726882608.24567: stderr chunk (state=3): >>><<< 15794 1726882608.24730: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882608.24736: _low_level_execute_command(): starting 15794 1726882608.24740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507 `" && echo ansible-tmp-1726882608.246359-16009-66287587888507="` echo /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507 `" ) && sleep 0' 15794 1726882608.25948: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882608.25958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882608.25980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882608.26098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882608.28096: stdout chunk (state=3): >>>ansible-tmp-1726882608.246359-16009-66287587888507=/root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507 <<< 15794 1726882608.28341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882608.28344: stdout chunk (state=3): >>><<< 15794 1726882608.28347: stderr chunk (state=3): >>><<< 15794 1726882608.28349: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882608.246359-16009-66287587888507=/root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882608.28365: variable 'ansible_module_compression' from source: unknown 15794 1726882608.28423: ANSIBALLZ: Using generic lock for ansible.legacy.command 15794 1726882608.28427: ANSIBALLZ: Acquiring lock 15794 1726882608.28430: ANSIBALLZ: Lock acquired: 139758818400528 15794 1726882608.28432: ANSIBALLZ: Creating module 15794 1726882608.48589: ANSIBALLZ: Writing module into payload 15794 1726882608.48719: ANSIBALLZ: Writing module 15794 1726882608.48752: ANSIBALLZ: Renaming module 15794 1726882608.48771: ANSIBALLZ: Done creating module 15794 1726882608.48797: variable 'ansible_facts' from source: unknown 15794 1726882608.48884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/AnsiballZ_command.py 15794 1726882608.49058: Sending initial data 15794 1726882608.49068: Sent initial data (154 bytes) 15794 1726882608.49763: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882608.49845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882608.49865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882608.49910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882608.49940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882608.50043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882608.51751: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882608.51804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882608.51862: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp7xzvc5en /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/AnsiballZ_command.py <<< 15794 1726882608.51870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/AnsiballZ_command.py" <<< 15794 1726882608.51919: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp7xzvc5en" to remote "/root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/AnsiballZ_command.py" <<< 15794 1726882608.52853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882608.52900: stderr chunk (state=3): >>><<< 15794 1726882608.52904: stdout chunk (state=3): >>><<< 15794 1726882608.53040: done transferring module to remote 15794 1726882608.53044: _low_level_execute_command(): starting 15794 1726882608.53047: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/ /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/AnsiballZ_command.py && sleep 0' 15794 1726882608.53625: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882608.53685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882608.53710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882608.53762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882608.55667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882608.55707: stderr chunk (state=3): >>><<< 15794 1726882608.55720: stdout chunk (state=3): >>><<< 15794 1726882608.55752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882608.55755: _low_level_execute_command(): starting 15794 1726882608.55759: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/AnsiballZ_command.py && sleep 0' 15794 1726882608.56441: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882608.56445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882608.56448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882608.56454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882608.56539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882608.56543: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882608.56546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882608.56548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882608.56551: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882608.56553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15794 1726882608.56555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882608.56561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882608.56622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882608.56670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882608.56684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882608.56706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882608.56793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882608.74307: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:48.737629", "end": "2024-09-20 21:36:48.741173", "delta": "0:00:00.003544", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882608.76223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882608.76228: stdout chunk (state=3): >>><<< 15794 1726882608.76231: stderr chunk (state=3): >>><<< 15794 1726882608.76236: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:48.737629", "end": "2024-09-20 21:36:48.741173", "delta": "0:00:00.003544", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
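The low-level execution above returned the module result as a single JSON document on stdout. The interface list that later tasks consume is just that JSON's `stdout` field split on newlines; a minimal sketch in plain Python (not Ansible's actual result-handling code) of how the logged payload maps to the list:

```python
import json

# Abridged module output as captured in the log above (illustrative;
# the real payload carries additional keys such as "invocation").
raw = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "rc": 0}'

result = json.loads(raw)

# Ansible derives stdout_lines by splitting stdout on newlines.
interfaces = result["stdout"].splitlines()
print(interfaces)  # ['bonding_masters', 'eth0', 'lo']
```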
15794 1726882608.76239: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882608.76242: _low_level_execute_command(): starting 15794 1726882608.76244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882608.246359-16009-66287587888507/ > /dev/null 2>&1 && sleep 0' 15794 1726882608.76906: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882608.76931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882608.76951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882608.77002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882608.77019: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882608.77099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882608.77113: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882608.77153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882608.77171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882608.77196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882608.77295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882608.79322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882608.79326: stdout chunk (state=3): >>><<< 15794 1726882608.79328: stderr chunk (state=3): >>><<< 15794 1726882608.79540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882608.79543: handler run complete 15794 1726882608.79546: Evaluated conditional (False): False 15794 1726882608.79548: attempt loop complete, returning result 15794 1726882608.79550: _execute() done 15794 1726882608.79552: dumping result to json 15794 1726882608.79554: done dumping result, returning 15794 1726882608.79557: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affe814-3a2d-94e5-e48f-000000000115] 15794 1726882608.79559: sending task result for task 0affe814-3a2d-94e5-e48f-000000000115 15794 1726882608.79639: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000115 15794 1726882608.79642: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003544", "end": "2024-09-20 21:36:48.741173", "rc": 0, "start": "2024-09-20 21:36:48.737629" } STDOUT: bonding_masters eth0 lo 15794 1726882608.79746: no more pending results, returning what we have 15794 1726882608.79750: results queue empty 15794 1726882608.79751: checking for any_errors_fatal 15794 1726882608.79753: done checking for any_errors_fatal 15794 1726882608.79754: checking for max_fail_percentage 15794 1726882608.79756: done checking for max_fail_percentage 15794 1726882608.79757: checking to see if all hosts have failed and the running result is not ok 15794 1726882608.79758: done checking to see if all hosts have failed 15794 1726882608.79759: getting the remaining hosts for this loop 15794 1726882608.79762: done getting the remaining hosts for this loop 15794 1726882608.79766: getting the next task for host managed_node1 15794 1726882608.79775: done getting next task for host managed_node1 15794 1726882608.79781: ^ task is: TASK: Set current_interfaces 15794 1726882608.79786: ^ state is: HOST STATE: 
block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882608.79791: getting variables 15794 1726882608.79793: in VariableManager get_vars() 15794 1726882608.79827: Calling all_inventory to load vars for managed_node1 15794 1726882608.79830: Calling groups_inventory to load vars for managed_node1 15794 1726882608.79951: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.79965: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.79969: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.79973: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.80458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.80777: done with get_vars() 15794 1726882608.80793: done getting variables 15794 1726882608.80874: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set 
current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:48 -0400 (0:00:00.625) 0:00:06.367 ****** 15794 1726882608.80912: entering _queue_task() for managed_node1/set_fact 15794 1726882608.81215: worker is 1 (out of 1 available) 15794 1726882608.81227: exiting _queue_task() for managed_node1/set_fact 15794 1726882608.81374: done queuing things up, now waiting for results queue to drain 15794 1726882608.81376: waiting for pending results... 15794 1726882608.81601: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 15794 1726882608.81697: in run() - task 0affe814-3a2d-94e5-e48f-000000000116 15794 1726882608.81702: variable 'ansible_search_path' from source: unknown 15794 1726882608.81710: variable 'ansible_search_path' from source: unknown 15794 1726882608.81807: calling self._execute() 15794 1726882608.81875: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.81893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.81914: variable 'omit' from source: magic vars 15794 1726882608.82392: variable 'ansible_distribution_major_version' from source: facts 15794 1726882608.82410: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882608.82421: variable 'omit' from source: magic vars 15794 1726882608.82491: variable 'omit' from source: magic vars 15794 1726882608.82714: variable '_current_interfaces' from source: set_fact 15794 1726882608.82805: variable 'omit' from source: magic vars 15794 1726882608.82897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882608.82900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882608.82925: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882608.82952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.82971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.83021: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882608.83031: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.83041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.83171: Set connection var ansible_connection to ssh 15794 1726882608.83189: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882608.83201: Set connection var ansible_pipelining to False 15794 1726882608.83222: Set connection var ansible_shell_executable to /bin/sh 15794 1726882608.83225: Set connection var ansible_shell_type to sh 15794 1726882608.83332: Set connection var ansible_timeout to 10 15794 1726882608.83337: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.83340: variable 'ansible_connection' from source: unknown 15794 1726882608.83342: variable 'ansible_module_compression' from source: unknown 15794 1726882608.83345: variable 'ansible_shell_type' from source: unknown 15794 1726882608.83347: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.83349: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.83351: variable 'ansible_pipelining' from source: unknown 15794 1726882608.83353: variable 'ansible_timeout' from source: unknown 15794 1726882608.83355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.83518: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882608.83535: variable 'omit' from source: magic vars 15794 1726882608.83552: starting attempt loop 15794 1726882608.83560: running the handler 15794 1726882608.83582: handler run complete 15794 1726882608.83598: attempt loop complete, returning result 15794 1726882608.83639: _execute() done 15794 1726882608.83642: dumping result to json 15794 1726882608.83644: done dumping result, returning 15794 1726882608.83647: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affe814-3a2d-94e5-e48f-000000000116] 15794 1726882608.83649: sending task result for task 0affe814-3a2d-94e5-e48f-000000000116 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15794 1726882608.83816: no more pending results, returning what we have 15794 1726882608.83819: results queue empty 15794 1726882608.83820: checking for any_errors_fatal 15794 1726882608.83828: done checking for any_errors_fatal 15794 1726882608.83829: checking for max_fail_percentage 15794 1726882608.83832: done checking for max_fail_percentage 15794 1726882608.83833: checking to see if all hosts have failed and the running result is not ok 15794 1726882608.83836: done checking to see if all hosts have failed 15794 1726882608.83837: getting the remaining hosts for this loop 15794 1726882608.83839: done getting the remaining hosts for this loop 15794 1726882608.83843: getting the next task for host managed_node1 15794 1726882608.83853: done getting next task for host managed_node1 15794 1726882608.83856: ^ task is: TASK: Show current_interfaces 15794 1726882608.83860: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882608.83866: getting variables 15794 1726882608.83868: in VariableManager get_vars() 15794 1726882608.84117: Calling all_inventory to load vars for managed_node1 15794 1726882608.84121: Calling groups_inventory to load vars for managed_node1 15794 1726882608.84125: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.84140: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.84144: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.84150: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000116 15794 1726882608.84154: WORKER PROCESS EXITING 15794 1726882608.84158: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.84632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.85175: done with get_vars() 15794 1726882608.85188: done getting variables 15794 1726882608.85368: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:48 -0400 (0:00:00.044) 0:00:06.411 ****** 15794 1726882608.85402: entering 
_queue_task() for managed_node1/debug 15794 1726882608.85917: worker is 1 (out of 1 available) 15794 1726882608.85931: exiting _queue_task() for managed_node1/debug 15794 1726882608.85945: done queuing things up, now waiting for results queue to drain 15794 1726882608.85947: waiting for pending results... 15794 1726882608.86162: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 15794 1726882608.86281: in run() - task 0affe814-3a2d-94e5-e48f-000000000105 15794 1726882608.86302: variable 'ansible_search_path' from source: unknown 15794 1726882608.86310: variable 'ansible_search_path' from source: unknown 15794 1726882608.86357: calling self._execute() 15794 1726882608.86447: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.86461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.86539: variable 'omit' from source: magic vars 15794 1726882608.86894: variable 'ansible_distribution_major_version' from source: facts 15794 1726882608.86915: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882608.86925: variable 'omit' from source: magic vars 15794 1726882608.86971: variable 'omit' from source: magic vars 15794 1726882608.87140: variable 'current_interfaces' from source: set_fact 15794 1726882608.87156: variable 'omit' from source: magic vars 15794 1726882608.87207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882608.87330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882608.87336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882608.87339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.87343: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882608.87389: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882608.87399: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.87408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.87546: Set connection var ansible_connection to ssh 15794 1726882608.87564: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882608.87651: Set connection var ansible_pipelining to False 15794 1726882608.87655: Set connection var ansible_shell_executable to /bin/sh 15794 1726882608.87657: Set connection var ansible_shell_type to sh 15794 1726882608.87660: Set connection var ansible_timeout to 10 15794 1726882608.87663: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.87665: variable 'ansible_connection' from source: unknown 15794 1726882608.87667: variable 'ansible_module_compression' from source: unknown 15794 1726882608.87669: variable 'ansible_shell_type' from source: unknown 15794 1726882608.87671: variable 'ansible_shell_executable' from source: unknown 15794 1726882608.87684: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.87696: variable 'ansible_pipelining' from source: unknown 15794 1726882608.87704: variable 'ansible_timeout' from source: unknown 15794 1726882608.87713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.88099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882608.88240: variable 'omit' from source: magic vars 15794 1726882608.88439: starting attempt 
loop 15794 1726882608.88442: running the handler 15794 1726882608.88446: handler run complete 15794 1726882608.88449: attempt loop complete, returning result 15794 1726882608.88451: _execute() done 15794 1726882608.88453: dumping result to json 15794 1726882608.88455: done dumping result, returning 15794 1726882608.88458: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affe814-3a2d-94e5-e48f-000000000105] 15794 1726882608.88460: sending task result for task 0affe814-3a2d-94e5-e48f-000000000105 15794 1726882608.88537: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000105 15794 1726882608.88540: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15794 1726882608.88600: no more pending results, returning what we have 15794 1726882608.88604: results queue empty 15794 1726882608.88605: checking for any_errors_fatal 15794 1726882608.88613: done checking for any_errors_fatal 15794 1726882608.88614: checking for max_fail_percentage 15794 1726882608.88616: done checking for max_fail_percentage 15794 1726882608.88618: checking to see if all hosts have failed and the running result is not ok 15794 1726882608.88619: done checking to see if all hosts have failed 15794 1726882608.88620: getting the remaining hosts for this loop 15794 1726882608.88622: done getting the remaining hosts for this loop 15794 1726882608.88627: getting the next task for host managed_node1 15794 1726882608.88639: done getting next task for host managed_node1 15794 1726882608.88644: ^ task is: TASK: Include the task 'manage_test_interface.yml' 15794 1726882608.88647: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882608.88656: getting variables 15794 1726882608.88658: in VariableManager get_vars() 15794 1726882608.88693: Calling all_inventory to load vars for managed_node1 15794 1726882608.88697: Calling groups_inventory to load vars for managed_node1 15794 1726882608.88702: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.88714: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.88718: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.88722: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.89994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.90525: done with get_vars() 15794 1726882608.90641: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 21:36:48 -0400 (0:00:00.054) 0:00:06.466 ****** 15794 1726882608.90849: entering _queue_task() for managed_node1/include_tasks 15794 1726882608.91213: worker is 1 (out of 1 available) 15794 1726882608.91225: exiting _queue_task() for managed_node1/include_tasks 15794 1726882608.91542: done queuing things up, now waiting for results queue to drain 15794 1726882608.91544: waiting for pending results... 
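The three tasks traced so far (gather with `command`, store with `set_fact`, print with `debug`) follow the usual helper pattern. A minimal sketch of equivalent tasks, reconstructed from the logged module arguments and results — the actual `get_current_interfaces.yml` and `show_interfaces.yml` in the collection may differ in detail:

```yaml
# Sketch only -- reconstructed from the log, not copied from the
# fedora.linux_system_roles test helpers.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```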
15794 1726882608.91715: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 15794 1726882608.91720: in run() - task 0affe814-3a2d-94e5-e48f-000000000011 15794 1726882608.91736: variable 'ansible_search_path' from source: unknown 15794 1726882608.91782: calling self._execute() 15794 1726882608.91878: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882608.91892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882608.91909: variable 'omit' from source: magic vars 15794 1726882608.92420: variable 'ansible_distribution_major_version' from source: facts 15794 1726882608.92442: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882608.92455: _execute() done 15794 1726882608.92468: dumping result to json 15794 1726882608.92478: done dumping result, returning 15794 1726882608.92541: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0affe814-3a2d-94e5-e48f-000000000011] 15794 1726882608.92544: sending task result for task 0affe814-3a2d-94e5-e48f-000000000011 15794 1726882608.92625: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000011 15794 1726882608.92630: WORKER PROCESS EXITING 15794 1726882608.92663: no more pending results, returning what we have 15794 1726882608.92669: in VariableManager get_vars() 15794 1726882608.92707: Calling all_inventory to load vars for managed_node1 15794 1726882608.92710: Calling groups_inventory to load vars for managed_node1 15794 1726882608.92715: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.92729: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.92732: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.92738: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.93111: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.93796: done with get_vars() 15794 1726882608.93804: variable 'ansible_search_path' from source: unknown 15794 1726882608.93817: we have included files to process 15794 1726882608.93818: generating all_blocks data 15794 1726882608.93820: done generating all_blocks data 15794 1726882608.93825: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 15794 1726882608.93826: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 15794 1726882608.93829: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 15794 1726882608.95214: in VariableManager get_vars() 15794 1726882608.95233: done with get_vars() 15794 1726882608.95915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15794 1726882608.97573: done processing included file 15794 1726882608.97575: iterating over new_blocks loaded from include file 15794 1726882608.97577: in VariableManager get_vars() 15794 1726882608.97593: done with get_vars() 15794 1726882608.97595: filtering new block on tags 15794 1726882608.97638: done filtering new block on tags 15794 1726882608.97641: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 15794 1726882608.97647: extending task lists for all hosts with included blocks 15794 1726882608.98089: done extending task lists 15794 1726882608.98091: done processing included files 15794 1726882608.98092: results queue empty 15794 1726882608.98093: checking for any_errors_fatal 15794 1726882608.98096: done checking for 
any_errors_fatal 15794 1726882608.98097: checking for max_fail_percentage 15794 1726882608.98098: done checking for max_fail_percentage 15794 1726882608.98099: checking to see if all hosts have failed and the running result is not ok 15794 1726882608.98100: done checking to see if all hosts have failed 15794 1726882608.98101: getting the remaining hosts for this loop 15794 1726882608.98102: done getting the remaining hosts for this loop 15794 1726882608.98105: getting the next task for host managed_node1 15794 1726882608.98109: done getting next task for host managed_node1 15794 1726882608.98112: ^ task is: TASK: Ensure state in ["present", "absent"] 15794 1726882608.98115: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882608.98117: getting variables 15794 1726882608.98119: in VariableManager get_vars() 15794 1726882608.98128: Calling all_inventory to load vars for managed_node1 15794 1726882608.98131: Calling groups_inventory to load vars for managed_node1 15794 1726882608.98337: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882608.98344: Calling all_plugins_play to load vars for managed_node1 15794 1726882608.98348: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882608.98352: Calling groups_plugins_play to load vars for managed_node1 15794 1726882608.98777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882608.99260: done with get_vars() 15794 1726882608.99270: done getting variables 15794 1726882608.99548: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:36:48 -0400 (0:00:00.087) 0:00:06.553 ****** 15794 1726882608.99577: entering _queue_task() for managed_node1/fail 15794 1726882608.99581: Creating lock for fail 15794 1726882609.00071: worker is 1 (out of 1 available) 15794 1726882609.00090: exiting _queue_task() for managed_node1/fail 15794 1726882609.00102: done queuing things up, now waiting for results queue to drain 15794 1726882609.00104: waiting for pending results... 
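The `fail` task just queued is an input-validation guard; the log below records it being skipped because its `when` condition (`state not in ["present", "absent"]`) evaluated to False. A minimal sketch of such a guard task, assuming the actual `manage_test_interface.yml` may word it differently:

```yaml
# Sketch only -- the conditional matches the false_condition recorded
# in the log; the real task's message text is an assumption.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"
  when: state not in ["present", "absent"]
```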
15794 1726882609.00852: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 15794 1726882609.00857: in run() - task 0affe814-3a2d-94e5-e48f-000000000131 15794 1726882609.00860: variable 'ansible_search_path' from source: unknown 15794 1726882609.00863: variable 'ansible_search_path' from source: unknown 15794 1726882609.00867: calling self._execute() 15794 1726882609.01239: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.01243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.01247: variable 'omit' from source: magic vars 15794 1726882609.01941: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.01960: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.02439: variable 'state' from source: include params 15794 1726882609.02443: Evaluated conditional (state not in ["present", "absent"]): False 15794 1726882609.02445: when evaluation is False, skipping this task 15794 1726882609.02447: _execute() done 15794 1726882609.02450: dumping result to json 15794 1726882609.02453: done dumping result, returning 15794 1726882609.02455: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0affe814-3a2d-94e5-e48f-000000000131] 15794 1726882609.02458: sending task result for task 0affe814-3a2d-94e5-e48f-000000000131 15794 1726882609.02531: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000131 skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 15794 1726882609.02593: no more pending results, returning what we have 15794 1726882609.02599: results queue empty 15794 1726882609.02601: checking for any_errors_fatal 15794 1726882609.02603: done checking for any_errors_fatal 15794 1726882609.02604: checking for max_fail_percentage 15794 
1726882609.02606: done checking for max_fail_percentage 15794 1726882609.02607: checking to see if all hosts have failed and the running result is not ok 15794 1726882609.02608: done checking to see if all hosts have failed 15794 1726882609.02609: getting the remaining hosts for this loop 15794 1726882609.02611: done getting the remaining hosts for this loop 15794 1726882609.02615: getting the next task for host managed_node1 15794 1726882609.02624: done getting next task for host managed_node1 15794 1726882609.02628: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 15794 1726882609.02631: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882609.02638: getting variables 15794 1726882609.02640: in VariableManager get_vars() 15794 1726882609.02673: Calling all_inventory to load vars for managed_node1 15794 1726882609.02677: Calling groups_inventory to load vars for managed_node1 15794 1726882609.02684: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.02698: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.02702: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.02706: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.03362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.03850: done with get_vars() 15794 1726882609.03862: done getting variables 15794 1726882609.03893: WORKER PROCESS EXITING 15794 1726882609.03926: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:36:49 -0400 (0:00:00.043) 0:00:06.597 ****** 15794 1726882609.03957: entering _queue_task() for managed_node1/fail 15794 1726882609.04606: worker is 1 (out of 1 available) 15794 1726882609.04621: exiting _queue_task() for managed_node1/fail 15794 1726882609.04636: done queuing things up, now waiting for results queue to drain 15794 1726882609.04638: waiting for pending results... 
15794 1726882609.05104: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 15794 1726882609.05273: in run() - task 0affe814-3a2d-94e5-e48f-000000000132 15794 1726882609.05295: variable 'ansible_search_path' from source: unknown 15794 1726882609.05300: variable 'ansible_search_path' from source: unknown 15794 1726882609.05339: calling self._execute() 15794 1726882609.05584: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.05588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.05840: variable 'omit' from source: magic vars 15794 1726882609.06451: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.06466: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.06883: variable 'type' from source: set_fact 15794 1726882609.06888: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 15794 1726882609.06891: when evaluation is False, skipping this task 15794 1726882609.06893: _execute() done 15794 1726882609.06899: dumping result to json 15794 1726882609.07019: done dumping result, returning 15794 1726882609.07023: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affe814-3a2d-94e5-e48f-000000000132] 15794 1726882609.07033: sending task result for task 0affe814-3a2d-94e5-e48f-000000000132 skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 15794 1726882609.07194: no more pending results, returning what we have 15794 1726882609.07199: results queue empty 15794 1726882609.07200: checking for any_errors_fatal 15794 1726882609.07208: done checking for any_errors_fatal 15794 1726882609.07209: checking for max_fail_percentage 15794 1726882609.07212: done checking for max_fail_percentage 15794 1726882609.07213: checking to see if all 
hosts have failed and the running result is not ok 15794 1726882609.07213: done checking to see if all hosts have failed 15794 1726882609.07214: getting the remaining hosts for this loop 15794 1726882609.07217: done getting the remaining hosts for this loop 15794 1726882609.07222: getting the next task for host managed_node1 15794 1726882609.07231: done getting next task for host managed_node1 15794 1726882609.07237: ^ task is: TASK: Include the task 'show_interfaces.yml' 15794 1726882609.07241: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882609.07246: getting variables 15794 1726882609.07248: in VariableManager get_vars() 15794 1726882609.07282: Calling all_inventory to load vars for managed_node1 15794 1726882609.07285: Calling groups_inventory to load vars for managed_node1 15794 1726882609.07291: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.07304: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.07307: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.07311: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.07946: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000132 15794 1726882609.07950: WORKER PROCESS EXITING 15794 1726882609.08030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.08495: done with get_vars() 15794 1726882609.08506: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:36:49 -0400 (0:00:00.048) 0:00:06.646 ****** 15794 1726882609.08830: entering _queue_task() for managed_node1/include_tasks 15794 1726882609.09557: worker is 1 (out of 1 available) 15794 1726882609.09571: exiting _queue_task() for managed_node1/include_tasks 15794 1726882609.09589: done queuing things up, now waiting for results queue to drain 15794 1726882609.09591: waiting for pending results... 
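The two skipped tasks traced above ("Ensure state in [\"present\", \"absent\"]" and "Ensure type in [\"dummy\", \"tap\", \"veth\"]") are validation guards whose `when` conditions evaluated to False. A plausible reconstruction of those guards in manage_test_interface.yml (the conditionals are copied from the logged `false_condition` fields; the task bodies and messages are assumptions, not the verified role source) is:

```yaml
# Sketch of the two validation guards whose skips are logged above.
# The "when" expressions are verbatim from the log's false_condition
# output; the fail messages are assumed for illustration.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"
  when: type not in ["dummy", "tap", "veth"]
```

Because the supplied `state` (from include params) and `type` (from set_fact) were within the allowed sets, both conditions were False, so Ansible skipped the `fail` actions and reported `"skip_reason": "Conditional result was False"` instead of aborting the play.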
15794 1726882609.10416: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 15794 1726882609.10715: in run() - task 0affe814-3a2d-94e5-e48f-000000000133 15794 1726882609.10730: variable 'ansible_search_path' from source: unknown 15794 1726882609.10736: variable 'ansible_search_path' from source: unknown 15794 1726882609.10776: calling self._execute() 15794 1726882609.10863: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.10871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.10885: variable 'omit' from source: magic vars 15794 1726882609.11697: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.11709: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.11717: _execute() done 15794 1726882609.11722: dumping result to json 15794 1726882609.11727: done dumping result, returning 15794 1726882609.11735: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-94e5-e48f-000000000133] 15794 1726882609.11947: sending task result for task 0affe814-3a2d-94e5-e48f-000000000133 15794 1726882609.12161: no more pending results, returning what we have 15794 1726882609.12167: in VariableManager get_vars() 15794 1726882609.12202: Calling all_inventory to load vars for managed_node1 15794 1726882609.12206: Calling groups_inventory to load vars for managed_node1 15794 1726882609.12209: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.12221: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.12225: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.12229: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.12649: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000133 15794 1726882609.12652: WORKER PROCESS EXITING 15794 1726882609.12682: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.13367: done with get_vars() 15794 1726882609.13376: variable 'ansible_search_path' from source: unknown 15794 1726882609.13380: variable 'ansible_search_path' from source: unknown 15794 1726882609.13422: we have included files to process 15794 1726882609.13424: generating all_blocks data 15794 1726882609.13426: done generating all_blocks data 15794 1726882609.13432: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15794 1726882609.13436: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15794 1726882609.13439: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15794 1726882609.13772: in VariableManager get_vars() 15794 1726882609.13797: done with get_vars() 15794 1726882609.14135: done processing included file 15794 1726882609.14139: iterating over new_blocks loaded from include file 15794 1726882609.14140: in VariableManager get_vars() 15794 1726882609.14157: done with get_vars() 15794 1726882609.14159: filtering new block on tags 15794 1726882609.14184: done filtering new block on tags 15794 1726882609.14186: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 15794 1726882609.14192: extending task lists for all hosts with included blocks 15794 1726882609.15391: done extending task lists 15794 1726882609.15393: done processing included files 15794 1726882609.15394: results queue empty 15794 1726882609.15395: checking for any_errors_fatal 15794 1726882609.15398: done checking for any_errors_fatal 15794 1726882609.15399: checking for 
max_fail_percentage 15794 1726882609.15401: done checking for max_fail_percentage 15794 1726882609.15401: checking to see if all hosts have failed and the running result is not ok 15794 1726882609.15403: done checking to see if all hosts have failed 15794 1726882609.15403: getting the remaining hosts for this loop 15794 1726882609.15405: done getting the remaining hosts for this loop 15794 1726882609.15408: getting the next task for host managed_node1 15794 1726882609.15413: done getting next task for host managed_node1 15794 1726882609.15416: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15794 1726882609.15419: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882609.15422: getting variables 15794 1726882609.15423: in VariableManager get_vars() 15794 1726882609.15433: Calling all_inventory to load vars for managed_node1 15794 1726882609.15438: Calling groups_inventory to load vars for managed_node1 15794 1726882609.15441: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.15447: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.15450: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.15454: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.15855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.16541: done with get_vars() 15794 1726882609.16552: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:49 -0400 (0:00:00.078) 0:00:06.724 ****** 15794 1726882609.16644: entering _queue_task() for managed_node1/include_tasks 15794 1726882609.17286: worker is 1 (out of 1 available) 15794 1726882609.17301: exiting _queue_task() for managed_node1/include_tasks 15794 1726882609.17315: done queuing things up, now waiting for results queue to drain 15794 1726882609.17316: waiting for pending results... 
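The "we have included files to process / generating all_blocks data / extending task lists" sequence above is Ansible's dynamic `include_tasks` handling: each include is itself a task, and its blocks are parsed and spliced into the host's task list at run time, which is why a nested HOST STATE child appears in the strategy output. Based on the task paths in the log, the include chain looks roughly like this (task names and paths are from the log; the exact YAML wording is an assumption):

```yaml
# manage_test_interface.yml:13 (per the logged task path)
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml

# show_interfaces.yml:3 (per the logged task path)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each level of nesting adds another "tasks child state?" layer to the HOST STATE dump, matching the progressively deeper structures printed above.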
15794 1726882609.17630: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 15794 1726882609.17951: in run() - task 0affe814-3a2d-94e5-e48f-00000000015c 15794 1726882609.17964: variable 'ansible_search_path' from source: unknown 15794 1726882609.17968: variable 'ansible_search_path' from source: unknown 15794 1726882609.18255: calling self._execute() 15794 1726882609.18325: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.18332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.18346: variable 'omit' from source: magic vars 15794 1726882609.19171: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.19190: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.19196: _execute() done 15794 1726882609.19201: dumping result to json 15794 1726882609.19206: done dumping result, returning 15794 1726882609.19213: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-94e5-e48f-00000000015c] 15794 1726882609.19221: sending task result for task 0affe814-3a2d-94e5-e48f-00000000015c 15794 1726882609.19328: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000015c 15794 1726882609.19331: WORKER PROCESS EXITING 15794 1726882609.19368: no more pending results, returning what we have 15794 1726882609.19375: in VariableManager get_vars() 15794 1726882609.19415: Calling all_inventory to load vars for managed_node1 15794 1726882609.19419: Calling groups_inventory to load vars for managed_node1 15794 1726882609.19423: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.19439: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.19442: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.19446: Calling groups_plugins_play to load vars for managed_node1 15794 
1726882609.19884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.20571: done with get_vars() 15794 1726882609.20583: variable 'ansible_search_path' from source: unknown 15794 1726882609.20584: variable 'ansible_search_path' from source: unknown 15794 1726882609.20842: we have included files to process 15794 1726882609.20844: generating all_blocks data 15794 1726882609.20845: done generating all_blocks data 15794 1726882609.20847: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15794 1726882609.20848: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15794 1726882609.20851: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15794 1726882609.21666: done processing included file 15794 1726882609.21668: iterating over new_blocks loaded from include file 15794 1726882609.21670: in VariableManager get_vars() 15794 1726882609.21706: done with get_vars() 15794 1726882609.21709: filtering new block on tags 15794 1726882609.21733: done filtering new block on tags 15794 1726882609.22037: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 15794 1726882609.22044: extending task lists for all hosts with included blocks 15794 1726882609.22811: done extending task lists 15794 1726882609.22813: done processing included files 15794 1726882609.22814: results queue empty 15794 1726882609.22815: checking for any_errors_fatal 15794 1726882609.22819: done checking for any_errors_fatal 15794 1726882609.22821: checking for max_fail_percentage 15794 1726882609.22822: done 
checking for max_fail_percentage 15794 1726882609.22823: checking to see if all hosts have failed and the running result is not ok 15794 1726882609.22824: done checking to see if all hosts have failed 15794 1726882609.22825: getting the remaining hosts for this loop 15794 1726882609.22827: done getting the remaining hosts for this loop 15794 1726882609.22830: getting the next task for host managed_node1 15794 1726882609.23041: done getting next task for host managed_node1 15794 1726882609.23045: ^ task is: TASK: Gather current interface info 15794 1726882609.23048: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882609.23051: getting variables 15794 1726882609.23052: in VariableManager get_vars() 15794 1726882609.23064: Calling all_inventory to load vars for managed_node1 15794 1726882609.23066: Calling groups_inventory to load vars for managed_node1 15794 1726882609.23069: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.23075: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.23080: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.23084: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.23506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.24210: done with get_vars() 15794 1726882609.24222: done getting variables 15794 1726882609.24490: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:49 -0400 (0:00:00.078) 0:00:06.803 ****** 15794 1726882609.24529: entering _queue_task() for managed_node1/command 15794 1726882609.25083: worker is 1 (out of 1 available) 15794 1726882609.25098: exiting _queue_task() for managed_node1/command 15794 1726882609.25112: done queuing things up, now waiting for results queue to drain 15794 1726882609.25113: waiting for pending results... 
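The "Gather current interface info" task loads the `command` action plugin, so it is a command-module task defined at get_current_interfaces.yml:3. The actual command line is not visible in this excerpt; a sketch of what such a task commonly looks like (the command and register name here are assumptions, kept only to illustrate the shape) is:

```yaml
# Sketch only: the log confirms a command-module task at
# get_current_interfaces.yml:3, but not its arguments. Listing
# /sys/class/net is one common way to enumerate interfaces on Linux
# and is an assumption here, as is the register variable name.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: _current_interfaces
```

The trace that follows shows how the command module actually runs: the connection (ssh) and shell (sh) plugins are resolved, connection vars are set, then `_low_level_execute_command()` probes the remote home directory (`echo ~`), creates a remote temp directory under `/root/.ansible/tmp`, and transfers the AnsiballZ-packaged module (`AnsiballZ_command.py`) for execution.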
15794 1726882609.25776: running TaskExecutor() for managed_node1/TASK: Gather current interface info 15794 1726882609.26343: in run() - task 0affe814-3a2d-94e5-e48f-000000000193 15794 1726882609.26348: variable 'ansible_search_path' from source: unknown 15794 1726882609.26351: variable 'ansible_search_path' from source: unknown 15794 1726882609.26355: calling self._execute() 15794 1726882609.26358: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.26360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.26364: variable 'omit' from source: magic vars 15794 1726882609.27340: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.27359: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.27372: variable 'omit' from source: magic vars 15794 1726882609.27562: variable 'omit' from source: magic vars 15794 1726882609.27613: variable 'omit' from source: magic vars 15794 1726882609.27841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882609.27846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882609.27959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882609.27989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.28005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.28047: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882609.28149: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.28161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 
1726882609.28341: Set connection var ansible_connection to ssh 15794 1726882609.28542: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882609.28546: Set connection var ansible_pipelining to False 15794 1726882609.28548: Set connection var ansible_shell_executable to /bin/sh 15794 1726882609.28551: Set connection var ansible_shell_type to sh 15794 1726882609.28559: Set connection var ansible_timeout to 10 15794 1726882609.28603: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.28614: variable 'ansible_connection' from source: unknown 15794 1726882609.28624: variable 'ansible_module_compression' from source: unknown 15794 1726882609.28633: variable 'ansible_shell_type' from source: unknown 15794 1726882609.28657: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.28665: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.28840: variable 'ansible_pipelining' from source: unknown 15794 1726882609.28843: variable 'ansible_timeout' from source: unknown 15794 1726882609.28846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.29097: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882609.29306: variable 'omit' from source: magic vars 15794 1726882609.29310: starting attempt loop 15794 1726882609.29312: running the handler 15794 1726882609.29315: _low_level_execute_command(): starting 15794 1726882609.29317: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882609.30710: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882609.30820: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.30851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882609.30871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882609.30899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.31115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.32873: stdout chunk (state=3): >>>/root <<< 15794 1726882609.32996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.33114: stderr chunk (state=3): >>><<< 15794 1726882609.33218: stdout chunk (state=3): >>><<< 15794 1726882609.33259: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882609.33336: _low_level_execute_command(): starting 15794 1726882609.33350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748 `" && echo ansible-tmp-1726882609.3331864-16049-205097918034748="` echo /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748 `" ) && sleep 0' 15794 1726882609.34811: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882609.34920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.34966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882609.35021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882609.35049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.35265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.37310: stdout chunk (state=3): >>>ansible-tmp-1726882609.3331864-16049-205097918034748=/root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748 <<< 15794 1726882609.37449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.37523: stderr chunk (state=3): >>><<< 15794 1726882609.37537: stdout chunk (state=3): >>><<< 15794 1726882609.37588: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882609.3331864-16049-205097918034748=/root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882609.37632: variable 'ansible_module_compression' from source: unknown 15794 1726882609.37703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882609.37772: variable 'ansible_facts' from source: unknown 15794 1726882609.37868: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/AnsiballZ_command.py 15794 1726882609.38060: Sending initial data 15794 1726882609.38069: Sent initial data (156 bytes) 15794 1726882609.38673: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882609.38693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882609.38743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.38759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.38781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882609.38838: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.38890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882609.38912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882609.38939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.38998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.40751: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882609.40773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882609.40854: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpsfx1d50r /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/AnsiballZ_command.py <<< 15794 1726882609.40875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/AnsiballZ_command.py" <<< 15794 1726882609.40946: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpsfx1d50r" to remote "/root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/AnsiballZ_command.py" <<< 15794 1726882609.41951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.41992: stderr chunk (state=3): >>><<< 15794 1726882609.41995: stdout chunk (state=3): >>><<< 15794 1726882609.42016: done transferring module to remote 15794 1726882609.42030: _low_level_execute_command(): starting 15794 1726882609.42037: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/ /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/AnsiballZ_command.py && sleep 0' 15794 1726882609.42476: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.42482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.42485: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882609.42488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.42532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882609.42540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.42595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.44474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.44544: stderr chunk (state=3): >>><<< 15794 1726882609.44551: stdout chunk (state=3): >>><<< 15794 1726882609.44555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882609.44558: _low_level_execute_command(): starting 15794 1726882609.44566: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/AnsiballZ_command.py && sleep 0' 15794 1726882609.45023: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882609.45027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.45029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882609.45031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.45035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.45038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.45089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882609.45096: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.45164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.62760: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:49.622310", "end": "2024-09-20 21:36:49.625923", "delta": "0:00:00.003613", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882609.64532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882609.64593: stderr chunk (state=3): >>><<< 15794 1726882609.64597: stdout chunk (state=3): >>><<< 15794 1726882609.64615: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:49.622310", "end": "2024-09-20 21:36:49.625923", "delta": "0:00:00.003613", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882609.64657: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882609.64665: _low_level_execute_command(): starting 15794 1726882609.64674: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882609.3331864-16049-205097918034748/ > /dev/null 2>&1 && sleep 0' 15794 1726882609.65115: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882609.65159: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882609.65162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882609.65164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882609.65167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.65169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.65217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882609.65222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.65285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.67216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.67267: stderr chunk (state=3): >>><<< 15794 1726882609.67271: stdout chunk (state=3): >>><<< 15794 1726882609.67289: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882609.67299: handler run complete 15794 1726882609.67321: Evaluated conditional (False): False 15794 1726882609.67332: attempt loop complete, returning result 15794 1726882609.67337: _execute() done 15794 1726882609.67343: dumping result to json 15794 1726882609.67348: done dumping result, returning 15794 1726882609.67356: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affe814-3a2d-94e5-e48f-000000000193] 15794 1726882609.67362: sending task result for task 0affe814-3a2d-94e5-e48f-000000000193 15794 1726882609.67471: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000193 15794 1726882609.67474: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003613", "end": "2024-09-20 21:36:49.625923", "rc": 0, "start": "2024-09-20 21:36:49.622310" } STDOUT: bonding_masters eth0 lo 15794 1726882609.67570: no more pending results, returning what we have 15794 1726882609.67574: results queue empty 15794 1726882609.67576: checking for any_errors_fatal 15794 1726882609.67577: done checking for any_errors_fatal 15794 1726882609.67580: checking 
for max_fail_percentage 15794 1726882609.67584: done checking for max_fail_percentage 15794 1726882609.67585: checking to see if all hosts have failed and the running result is not ok 15794 1726882609.67586: done checking to see if all hosts have failed 15794 1726882609.67587: getting the remaining hosts for this loop 15794 1726882609.67589: done getting the remaining hosts for this loop 15794 1726882609.67593: getting the next task for host managed_node1 15794 1726882609.67601: done getting next task for host managed_node1 15794 1726882609.67604: ^ task is: TASK: Set current_interfaces 15794 1726882609.67609: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882609.67613: getting variables 15794 1726882609.67614: in VariableManager get_vars() 15794 1726882609.67651: Calling all_inventory to load vars for managed_node1 15794 1726882609.67654: Calling groups_inventory to load vars for managed_node1 15794 1726882609.67657: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.67668: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.67671: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.67674: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.67831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.67997: done with get_vars() 15794 1726882609.68007: done getting variables 15794 1726882609.68057: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:49 -0400 (0:00:00.435) 0:00:07.238 ****** 15794 1726882609.68087: entering _queue_task() for managed_node1/set_fact 15794 1726882609.68300: worker is 1 (out of 1 available) 15794 1726882609.68313: exiting _queue_task() for managed_node1/set_fact 15794 1726882609.68327: done queuing things up, now waiting for results queue to drain 15794 1726882609.68329: waiting for pending results... 
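The task result above (rc=0, stdout `bonding_masters`, `eth0`, `lo`) and the freshly queued `Set current_interfaces` task come from the `get_current_interfaces.yml` helper in the fedora.linux_system_roles test collection. A hypothetical sketch of that task file, reconstructed only from details visible in this log (the `chdir=/sys/class/net` / `ls -1` module args and the `_current_interfaces` / `current_interfaces` fact names); the real file may differ:

```yaml
# Hypothetical reconstruction based on the module_args
# (chdir: /sys/class/net, cmd: ls -1) and fact names in this log.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```

Listing `/sys/class/net` is a common way to enumerate kernel network interfaces, which is consistent with the `bonding_masters`, `eth0`, `lo` entries seen in the captured stdout.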
15794 1726882609.68491: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 15794 1726882609.68572: in run() - task 0affe814-3a2d-94e5-e48f-000000000194 15794 1726882609.68589: variable 'ansible_search_path' from source: unknown 15794 1726882609.68593: variable 'ansible_search_path' from source: unknown 15794 1726882609.68622: calling self._execute() 15794 1726882609.68692: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.68696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.68707: variable 'omit' from source: magic vars 15794 1726882609.69011: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.69023: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.69029: variable 'omit' from source: magic vars 15794 1726882609.69074: variable 'omit' from source: magic vars 15794 1726882609.69218: variable '_current_interfaces' from source: set_fact 15794 1726882609.69267: variable 'omit' from source: magic vars 15794 1726882609.69303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882609.69337: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882609.69355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882609.69370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.69383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.69408: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882609.69412: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.69417: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.69500: Set connection var ansible_connection to ssh 15794 1726882609.69507: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882609.69514: Set connection var ansible_pipelining to False 15794 1726882609.69521: Set connection var ansible_shell_executable to /bin/sh 15794 1726882609.69524: Set connection var ansible_shell_type to sh 15794 1726882609.69532: Set connection var ansible_timeout to 10 15794 1726882609.69563: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.69566: variable 'ansible_connection' from source: unknown 15794 1726882609.69569: variable 'ansible_module_compression' from source: unknown 15794 1726882609.69572: variable 'ansible_shell_type' from source: unknown 15794 1726882609.69576: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.69582: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.69585: variable 'ansible_pipelining' from source: unknown 15794 1726882609.69588: variable 'ansible_timeout' from source: unknown 15794 1726882609.69593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.69713: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882609.69722: variable 'omit' from source: magic vars 15794 1726882609.69729: starting attempt loop 15794 1726882609.69732: running the handler 15794 1726882609.69744: handler run complete 15794 1726882609.69755: attempt loop complete, returning result 15794 1726882609.69758: _execute() done 15794 1726882609.69761: dumping result to json 15794 1726882609.69773: done dumping result, returning 15794 
1726882609.69777: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affe814-3a2d-94e5-e48f-000000000194] 15794 1726882609.69782: sending task result for task 0affe814-3a2d-94e5-e48f-000000000194 15794 1726882609.69863: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000194 15794 1726882609.69866: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15794 1726882609.69935: no more pending results, returning what we have 15794 1726882609.69939: results queue empty 15794 1726882609.69940: checking for any_errors_fatal 15794 1726882609.69946: done checking for any_errors_fatal 15794 1726882609.69947: checking for max_fail_percentage 15794 1726882609.69949: done checking for max_fail_percentage 15794 1726882609.69950: checking to see if all hosts have failed and the running result is not ok 15794 1726882609.69951: done checking to see if all hosts have failed 15794 1726882609.69952: getting the remaining hosts for this loop 15794 1726882609.69954: done getting the remaining hosts for this loop 15794 1726882609.69957: getting the next task for host managed_node1 15794 1726882609.69966: done getting next task for host managed_node1 15794 1726882609.69969: ^ task is: TASK: Show current_interfaces 15794 1726882609.69972: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882609.69977: getting variables 15794 1726882609.69980: in VariableManager get_vars() 15794 1726882609.70006: Calling all_inventory to load vars for managed_node1 15794 1726882609.70009: Calling groups_inventory to load vars for managed_node1 15794 1726882609.70013: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.70023: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.70025: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.70027: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.70196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.70355: done with get_vars() 15794 1726882609.70364: done getting variables 15794 1726882609.70409: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:49 -0400 (0:00:00.023) 0:00:07.262 ****** 15794 1726882609.70432: entering _queue_task() for managed_node1/debug 15794 1726882609.70635: worker is 1 (out of 1 available) 15794 1726882609.70650: exiting _queue_task() for managed_node1/debug 15794 1726882609.70662: done queuing things up, now waiting for results queue to drain 15794 1726882609.70664: waiting for pending results... 
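The `Show current_interfaces` task being queued here (task path `show_interfaces.yml:5`, action plugin `debug`) presumably just prints the fact set by the previous task. A minimal hedged sketch, assuming a plain `debug` message over `current_interfaces`; the actual file in the collection may word it differently:

```yaml
# Hypothetical sketch of show_interfaces.yml:5, inferred from the
# debug action and the current_interfaces fact in this log.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```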
15794 1726882609.70815: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 15794 1726882609.70891: in run() - task 0affe814-3a2d-94e5-e48f-00000000015d 15794 1726882609.70906: variable 'ansible_search_path' from source: unknown 15794 1726882609.70910: variable 'ansible_search_path' from source: unknown 15794 1726882609.70942: calling self._execute() 15794 1726882609.71005: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.71009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.71021: variable 'omit' from source: magic vars 15794 1726882609.71314: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.71323: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.71330: variable 'omit' from source: magic vars 15794 1726882609.71372: variable 'omit' from source: magic vars 15794 1726882609.71452: variable 'current_interfaces' from source: set_fact 15794 1726882609.71476: variable 'omit' from source: magic vars 15794 1726882609.71510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882609.71539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882609.71557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882609.71575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.71587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.71613: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882609.71617: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.71622: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.71703: Set connection var ansible_connection to ssh 15794 1726882609.71712: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882609.71719: Set connection var ansible_pipelining to False 15794 1726882609.71726: Set connection var ansible_shell_executable to /bin/sh 15794 1726882609.71728: Set connection var ansible_shell_type to sh 15794 1726882609.71739: Set connection var ansible_timeout to 10 15794 1726882609.71763: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.71767: variable 'ansible_connection' from source: unknown 15794 1726882609.71770: variable 'ansible_module_compression' from source: unknown 15794 1726882609.71774: variable 'ansible_shell_type' from source: unknown 15794 1726882609.71780: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.71783: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.71786: variable 'ansible_pipelining' from source: unknown 15794 1726882609.71798: variable 'ansible_timeout' from source: unknown 15794 1726882609.71800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.71909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882609.71920: variable 'omit' from source: magic vars 15794 1726882609.71926: starting attempt loop 15794 1726882609.71929: running the handler 15794 1726882609.71969: handler run complete 15794 1726882609.71983: attempt loop complete, returning result 15794 1726882609.71987: _execute() done 15794 1726882609.71990: dumping result to json 15794 1726882609.71994: done dumping result, returning 15794 1726882609.72002: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affe814-3a2d-94e5-e48f-00000000015d] 15794 1726882609.72009: sending task result for task 0affe814-3a2d-94e5-e48f-00000000015d 15794 1726882609.72098: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000015d 15794 1726882609.72101: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15794 1726882609.72168: no more pending results, returning what we have 15794 1726882609.72171: results queue empty 15794 1726882609.72172: checking for any_errors_fatal 15794 1726882609.72176: done checking for any_errors_fatal 15794 1726882609.72177: checking for max_fail_percentage 15794 1726882609.72181: done checking for max_fail_percentage 15794 1726882609.72182: checking to see if all hosts have failed and the running result is not ok 15794 1726882609.72183: done checking to see if all hosts have failed 15794 1726882609.72183: getting the remaining hosts for this loop 15794 1726882609.72185: done getting the remaining hosts for this loop 15794 1726882609.72189: getting the next task for host managed_node1 15794 1726882609.72197: done getting next task for host managed_node1 15794 1726882609.72200: ^ task is: TASK: Install iproute 15794 1726882609.72203: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882609.72207: getting variables 15794 1726882609.72208: in VariableManager get_vars() 15794 1726882609.72233: Calling all_inventory to load vars for managed_node1 15794 1726882609.72236: Calling groups_inventory to load vars for managed_node1 15794 1726882609.72241: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882609.72248: Calling all_plugins_play to load vars for managed_node1 15794 1726882609.72250: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882609.72253: Calling groups_plugins_play to load vars for managed_node1 15794 1726882609.72388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882609.72544: done with get_vars() 15794 1726882609.72553: done getting variables 15794 1726882609.72599: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:36:49 -0400 (0:00:00.021) 0:00:07.284 ****** 15794 1726882609.72621: entering _queue_task() for managed_node1/package 15794 1726882609.72812: worker is 1 (out of 1 available) 15794 1726882609.72825: exiting _queue_task() for managed_node1/package 15794 1726882609.72841: done queuing things up, now waiting for results queue to drain 15794 1726882609.72842: waiting for pending results... 
15794 1726882609.72993: running TaskExecutor() for managed_node1/TASK: Install iproute 15794 1726882609.73052: in run() - task 0affe814-3a2d-94e5-e48f-000000000134 15794 1726882609.73065: variable 'ansible_search_path' from source: unknown 15794 1726882609.73070: variable 'ansible_search_path' from source: unknown 15794 1726882609.73102: calling self._execute() 15794 1726882609.73168: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.73174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.73189: variable 'omit' from source: magic vars 15794 1726882609.73552: variable 'ansible_distribution_major_version' from source: facts 15794 1726882609.73562: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882609.73569: variable 'omit' from source: magic vars 15794 1726882609.73603: variable 'omit' from source: magic vars 15794 1726882609.73768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882609.75555: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882609.75612: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882609.75643: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882609.75689: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882609.75713: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882609.75795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882609.75822: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882609.75846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882609.75882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882609.75897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882609.75979: variable '__network_is_ostree' from source: set_fact 15794 1726882609.75986: variable 'omit' from source: magic vars 15794 1726882609.76011: variable 'omit' from source: magic vars 15794 1726882609.76040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882609.76063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882609.76078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882609.76097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.76106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882609.76135: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882609.76138: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.76145: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15794 1726882609.76222: Set connection var ansible_connection to ssh 15794 1726882609.76229: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882609.76240: Set connection var ansible_pipelining to False 15794 1726882609.76249: Set connection var ansible_shell_executable to /bin/sh 15794 1726882609.76252: Set connection var ansible_shell_type to sh 15794 1726882609.76260: Set connection var ansible_timeout to 10 15794 1726882609.76287: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.76291: variable 'ansible_connection' from source: unknown 15794 1726882609.76293: variable 'ansible_module_compression' from source: unknown 15794 1726882609.76296: variable 'ansible_shell_type' from source: unknown 15794 1726882609.76299: variable 'ansible_shell_executable' from source: unknown 15794 1726882609.76304: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882609.76309: variable 'ansible_pipelining' from source: unknown 15794 1726882609.76313: variable 'ansible_timeout' from source: unknown 15794 1726882609.76318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882609.76405: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882609.76414: variable 'omit' from source: magic vars 15794 1726882609.76420: starting attempt loop 15794 1726882609.76423: running the handler 15794 1726882609.76430: variable 'ansible_facts' from source: unknown 15794 1726882609.76433: variable 'ansible_facts' from source: unknown 15794 1726882609.76467: _low_level_execute_command(): starting 15794 1726882609.76474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 
1726882609.76999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.77005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.77009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882609.77012: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.77068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882609.77072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882609.77074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.77146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.78912: stdout chunk (state=3): >>>/root <<< 15794 1726882609.79020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.79077: stderr chunk (state=3): >>><<< 15794 1726882609.79083: stdout chunk (state=3): >>><<< 15794 1726882609.79102: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882609.79112: _low_level_execute_command(): starting 15794 1726882609.79119: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524 `" && echo ansible-tmp-1726882609.791008-16077-36451687404524="` echo /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524 `" ) && sleep 0' 15794 1726882609.79577: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882609.79583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882609.79586: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882609.79588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882609.79591: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.79643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882609.79652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.79710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.81736: stdout chunk (state=3): >>>ansible-tmp-1726882609.791008-16077-36451687404524=/root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524 <<< 15794 1726882609.81862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.81909: stderr chunk (state=3): >>><<< 15794 1726882609.81912: stdout chunk (state=3): >>><<< 15794 1726882609.81926: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882609.791008-16077-36451687404524=/root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882609.81956: variable 'ansible_module_compression' from source: unknown 15794 1726882609.82009: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 15794 1726882609.82013: ANSIBALLZ: Acquiring lock 15794 1726882609.82015: ANSIBALLZ: Lock acquired: 139758818400528 15794 1726882609.82018: ANSIBALLZ: Creating module 15794 1726882609.95382: ANSIBALLZ: Writing module into payload 15794 1726882609.95571: ANSIBALLZ: Writing module 15794 1726882609.95594: ANSIBALLZ: Renaming module 15794 1726882609.95597: ANSIBALLZ: Done creating module 15794 1726882609.95618: variable 'ansible_facts' from source: unknown 15794 1726882609.95676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/AnsiballZ_dnf.py 15794 1726882609.95793: Sending initial data 15794 1726882609.95797: Sent initial data (150 bytes) 15794 1726882609.96296: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.96299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.96302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882609.96304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882609.96361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882609.96366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882609.96369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882609.96441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882609.98147: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 
1726882609.98201: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882609.98263: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpzw3vz9d4 /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/AnsiballZ_dnf.py <<< 15794 1726882609.98266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/AnsiballZ_dnf.py" <<< 15794 1726882609.98316: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpzw3vz9d4" to remote "/root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/AnsiballZ_dnf.py" <<< 15794 1726882609.99408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882609.99480: stderr chunk (state=3): >>><<< 15794 1726882609.99484: stdout chunk (state=3): >>><<< 15794 1726882609.99505: done transferring module to remote 15794 1726882609.99515: _low_level_execute_command(): starting 15794 1726882609.99524: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/ /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/AnsiballZ_dnf.py && sleep 0' 15794 1726882610.00009: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882610.00012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 
1726882610.00015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882610.00017: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882610.00019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882610.00068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882610.00072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882610.00138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882610.01995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882610.02044: stderr chunk (state=3): >>><<< 15794 1726882610.02052: stdout chunk (state=3): >>><<< 15794 1726882610.02065: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882610.02068: _low_level_execute_command(): starting 15794 1726882610.02076: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/AnsiballZ_dnf.py && sleep 0' 15794 1726882610.02539: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882610.02543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882610.02545: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882610.02548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882610.02550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882610.02602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 
1726882610.02607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882610.02668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882611.50285: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 15794 1726882611.55204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882611.55264: stderr chunk (state=3): >>><<< 15794 1726882611.55267: stdout chunk (state=3): >>><<< 15794 1726882611.55308: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882611.55344: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882611.55351: _low_level_execute_command(): starting 15794 1726882611.55357: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882609.791008-16077-36451687404524/ > /dev/null 2>&1 && sleep 0' 15794 1726882611.55810: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882611.55814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.55817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15794 1726882611.55819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.55870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882611.55876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882611.55943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882611.57864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882611.57907: stderr chunk (state=3): >>><<< 15794 1726882611.57911: stdout chunk (state=3): >>><<< 15794 1726882611.57924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882611.57936: handler run complete 15794 1726882611.58076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882611.58224: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882611.58260: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882611.58290: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882611.58317: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882611.58376: variable '__install_status' from source: unknown 15794 1726882611.58399: Evaluated conditional (__install_status is success): True 15794 1726882611.58414: attempt loop complete, returning result 15794 1726882611.58417: _execute() done 15794 1726882611.58419: dumping result to json 15794 1726882611.58426: done dumping result, returning 15794 1726882611.58435: done running TaskExecutor() for managed_node1/TASK: Install iproute [0affe814-3a2d-94e5-e48f-000000000134] 15794 1726882611.58441: sending task result for task 0affe814-3a2d-94e5-e48f-000000000134 15794 1726882611.58547: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000134 15794 1726882611.58550: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 15794 1726882611.58655: no more pending results, returning what we have 15794 1726882611.58659: results queue empty 15794 1726882611.58660: checking for any_errors_fatal 15794 1726882611.58667: done checking for any_errors_fatal 15794 1726882611.58668: checking for max_fail_percentage 15794 1726882611.58670: done checking for max_fail_percentage 15794 1726882611.58671: checking to see 
if all hosts have failed and the running result is not ok 15794 1726882611.58672: done checking to see if all hosts have failed 15794 1726882611.58673: getting the remaining hosts for this loop 15794 1726882611.58675: done getting the remaining hosts for this loop 15794 1726882611.58681: getting the next task for host managed_node1 15794 1726882611.58688: done getting next task for host managed_node1 15794 1726882611.58692: ^ task is: TASK: Create veth interface {{ interface }} 15794 1726882611.58695: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882611.58698: getting variables 15794 1726882611.58700: in VariableManager get_vars() 15794 1726882611.58730: Calling all_inventory to load vars for managed_node1 15794 1726882611.58742: Calling groups_inventory to load vars for managed_node1 15794 1726882611.58800: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882611.58811: Calling all_plugins_play to load vars for managed_node1 15794 1726882611.58815: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882611.58818: Calling groups_plugins_play to load vars for managed_node1 15794 1726882611.58974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882611.59136: done with get_vars() 15794 1726882611.59145: done getting variables 15794 1726882611.59198: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882611.59297: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:36:51 -0400 (0:00:01.867) 0:00:09.151 ****** 15794 1726882611.59324: entering _queue_task() for managed_node1/command 15794 1726882611.59541: worker is 1 (out of 1 available) 15794 1726882611.59554: exiting _queue_task() for managed_node1/command 15794 1726882611.59566: done queuing things up, now waiting for results queue to drain 15794 1726882611.59567: waiting for pending results... 
15794 1726882611.59951: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 15794 1726882611.59964: in run() - task 0affe814-3a2d-94e5-e48f-000000000135 15794 1726882611.59968: variable 'ansible_search_path' from source: unknown 15794 1726882611.59971: variable 'ansible_search_path' from source: unknown 15794 1726882611.60255: variable 'interface' from source: set_fact 15794 1726882611.60373: variable 'interface' from source: set_fact 15794 1726882611.60490: variable 'interface' from source: set_fact 15794 1726882611.60677: Loaded config def from plugin (lookup/items) 15794 1726882611.60694: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 15794 1726882611.60741: variable 'omit' from source: magic vars 15794 1726882611.60880: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882611.60942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882611.60950: variable 'omit' from source: magic vars 15794 1726882611.61193: variable 'ansible_distribution_major_version' from source: facts 15794 1726882611.61201: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882611.61377: variable 'type' from source: set_fact 15794 1726882611.61392: variable 'state' from source: include params 15794 1726882611.61399: variable 'interface' from source: set_fact 15794 1726882611.61402: variable 'current_interfaces' from source: set_fact 15794 1726882611.61404: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 15794 1726882611.61410: variable 'omit' from source: magic vars 15794 1726882611.61441: variable 'omit' from source: magic vars 15794 1726882611.61478: variable 'item' from source: unknown 15794 1726882611.61542: variable 'item' from source: unknown 15794 1726882611.61556: variable 'omit' from source: magic vars 15794 1726882611.61583: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882611.61614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882611.61629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882611.61647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882611.61656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882611.61685: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882611.61690: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882611.61693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882611.61776: Set connection var ansible_connection to ssh 15794 1726882611.61786: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882611.61793: Set connection var ansible_pipelining to False 15794 1726882611.61800: Set connection var ansible_shell_executable to /bin/sh 15794 1726882611.61803: Set connection var ansible_shell_type to sh 15794 1726882611.61812: Set connection var ansible_timeout to 10 15794 1726882611.61837: variable 'ansible_shell_executable' from source: unknown 15794 1726882611.61841: variable 'ansible_connection' from source: unknown 15794 1726882611.61845: variable 'ansible_module_compression' from source: unknown 15794 1726882611.61848: variable 'ansible_shell_type' from source: unknown 15794 1726882611.61850: variable 'ansible_shell_executable' from source: unknown 15794 1726882611.61855: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882611.61860: variable 'ansible_pipelining' from source: unknown 15794 1726882611.61863: variable 'ansible_timeout' from 
source: unknown 15794 1726882611.61869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882611.61991: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882611.62000: variable 'omit' from source: magic vars 15794 1726882611.62006: starting attempt loop 15794 1726882611.62009: running the handler 15794 1726882611.62022: _low_level_execute_command(): starting 15794 1726882611.62030: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882611.62507: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882611.62546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882611.62549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882611.62552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.62610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 
1726882611.62613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882611.62614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882611.62668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882611.64395: stdout chunk (state=3): >>>/root <<< 15794 1726882611.64605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882611.64608: stdout chunk (state=3): >>><<< 15794 1726882611.64610: stderr chunk (state=3): >>><<< 15794 1726882611.64629: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882611.64655: _low_level_execute_command(): starting 15794 1726882611.64665: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` 
echo /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571 `" && echo ansible-tmp-1726882611.646385-16115-165661201908571="` echo /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571 `" ) && sleep 0' 15794 1726882611.65157: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882611.65161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882611.65191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882611.65196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.65254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882611.65259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882611.65313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882611.67353: stdout chunk (state=3): >>>ansible-tmp-1726882611.646385-16115-165661201908571=/root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571 <<< 15794 1726882611.67462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 
1726882611.67507: stderr chunk (state=3): >>><<< 15794 1726882611.67510: stdout chunk (state=3): >>><<< 15794 1726882611.67525: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882611.646385-16115-165661201908571=/root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882611.67553: variable 'ansible_module_compression' from source: unknown 15794 1726882611.67596: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882611.67621: variable 'ansible_facts' from source: unknown 15794 1726882611.67694: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/AnsiballZ_command.py 15794 1726882611.67791: Sending initial data 15794 1726882611.67797: Sent initial data (155 bytes) 
15794 1726882611.68431: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882611.68480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882611.70121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15794 1726882611.70131: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882611.70176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882611.70232: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmphlip9zb9 /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/AnsiballZ_command.py <<< 15794 1726882611.70243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/AnsiballZ_command.py" <<< 15794 1726882611.70287: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmphlip9zb9" to remote "/root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/AnsiballZ_command.py" <<< 15794 1726882611.71135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882611.71189: stderr chunk (state=3): >>><<< 15794 1726882611.71196: stdout chunk (state=3): >>><<< 15794 1726882611.71211: done transferring module to remote 15794 1726882611.71221: _low_level_execute_command(): starting 15794 1726882611.71226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/ /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/AnsiballZ_command.py && sleep 0' 15794 1726882611.71613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882611.71653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 
debug2: match not found <<< 15794 1726882611.71660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.71663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882611.71665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.71710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882611.71717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882611.71773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882611.73616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882611.73657: stderr chunk (state=3): >>><<< 15794 1726882611.73660: stdout chunk (state=3): >>><<< 15794 1726882611.73673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882611.73676: _low_level_execute_command(): starting 15794 1726882611.73688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/AnsiballZ_command.py && sleep 0' 15794 1726882611.74096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882611.74100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.74102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882611.74104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.74163: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882611.74166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882611.74229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882611.92224: stdout chunk (state=3): >>> <<< 15794 1726882611.92252: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:36:51.912172", "end": "2024-09-20 21:36:51.920352", "delta": "0:00:00.008180", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882611.95277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882611.95323: stderr chunk (state=3): >>><<< 15794 1726882611.95327: stdout chunk (state=3): >>><<< 15794 1726882611.95347: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:36:51.912172", "end": "2024-09-20 21:36:51.920352", "delta": "0:00:00.008180", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882611.95415: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882611.95423: _low_level_execute_command(): starting 15794 1726882611.95429: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882611.646385-16115-165661201908571/ > /dev/null 2>&1 && sleep 0' 15794 1726882611.95893: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882611.95896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882611.95899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882611.95901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882611.95903: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882611.95952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882611.95959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882611.96024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.01200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.01242: stderr chunk (state=3): >>><<< 15794 1726882612.01247: stdout chunk (state=3): >>><<< 15794 1726882612.01260: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 15794 1726882612.01271: handler run complete 15794 1726882612.01295: Evaluated conditional (False): False 15794 1726882612.01305: attempt loop complete, returning result 15794 1726882612.01322: variable 'item' from source: unknown 15794 1726882612.01393: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.008180", "end": "2024-09-20 21:36:51.920352", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-20 21:36:51.912172" } 15794 1726882612.01576: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.01582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.01585: variable 'omit' from source: magic vars 15794 1726882612.01685: variable 'ansible_distribution_major_version' from source: facts 15794 1726882612.01689: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882612.01846: variable 'type' from source: set_fact 15794 1726882612.01849: variable 'state' from source: include params 15794 1726882612.01855: variable 'interface' from source: set_fact 15794 1726882612.01860: variable 'current_interfaces' from source: set_fact 15794 1726882612.01866: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 15794 1726882612.01872: variable 'omit' from source: magic vars 15794 1726882612.01886: variable 'omit' from source: magic vars 15794 1726882612.01920: variable 'item' from source: unknown 15794 1726882612.01972: variable 'item' from source: unknown 15794 1726882612.01986: variable 'omit' from source: magic vars 15794 1726882612.02004: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882612.02013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882612.02019: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882612.02035: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882612.02038: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.02047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.02105: Set connection var ansible_connection to ssh 15794 1726882612.02111: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882612.02118: Set connection var ansible_pipelining to False 15794 1726882612.02124: Set connection var ansible_shell_executable to /bin/sh 15794 1726882612.02127: Set connection var ansible_shell_type to sh 15794 1726882612.02141: Set connection var ansible_timeout to 10 15794 1726882612.02163: variable 'ansible_shell_executable' from source: unknown 15794 1726882612.02166: variable 'ansible_connection' from source: unknown 15794 1726882612.02169: variable 'ansible_module_compression' from source: unknown 15794 1726882612.02173: variable 'ansible_shell_type' from source: unknown 15794 1726882612.02176: variable 'ansible_shell_executable' from source: unknown 15794 1726882612.02182: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.02185: variable 'ansible_pipelining' from source: unknown 15794 1726882612.02190: variable 'ansible_timeout' from source: unknown 15794 1726882612.02195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.02276: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882612.02286: variable 'omit' from source: magic vars 15794 1726882612.02291: starting attempt loop 15794 1726882612.02294: running the handler 15794 1726882612.02302: _low_level_execute_command(): starting 15794 1726882612.02306: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882612.02721: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882612.02752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.02755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.02758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882612.02764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.02818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.02823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 <<< 15794 1726882612.02883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.04601: stdout chunk (state=3): >>>/root <<< 15794 1726882612.04708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.04756: stderr chunk (state=3): >>><<< 15794 1726882612.04760: stdout chunk (state=3): >>><<< 15794 1726882612.04772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.04782: _low_level_execute_command(): starting 15794 1726882612.04788: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104 `" && echo ansible-tmp-1726882612.0477178-16115-153009180792104="` echo 
/root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104 `" ) && sleep 0' 15794 1726882612.05226: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882612.05229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.05232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882612.05236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.05292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882612.05297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.05357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.07350: stdout chunk (state=3): >>>ansible-tmp-1726882612.0477178-16115-153009180792104=/root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104 <<< 15794 1726882612.07473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.07517: stderr chunk (state=3): >>><<< 15794 1726882612.07521: stdout chunk (state=3): >>><<< 15794 1726882612.07535: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882612.0477178-16115-153009180792104=/root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.07555: variable 'ansible_module_compression' from source: unknown 15794 1726882612.07585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882612.07603: variable 'ansible_facts' from source: unknown 15794 1726882612.07656: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/AnsiballZ_command.py 15794 1726882612.07753: Sending initial data 15794 1726882612.07757: Sent initial data (156 bytes) 15794 1726882612.08193: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration 
data /root/.ssh/config <<< 15794 1726882612.08198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.08201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882612.08203: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882612.08205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.08267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.08270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.08326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.09950: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882612.10005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882612.10059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpi1_688wg /root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/AnsiballZ_command.py <<< 15794 1726882612.10068: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/AnsiballZ_command.py" <<< 15794 1726882612.10113: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpi1_688wg" to remote "/root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/AnsiballZ_command.py" <<< 15794 1726882612.10969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.11022: stderr chunk (state=3): >>><<< 15794 1726882612.11026: stdout chunk (state=3): >>><<< 15794 1726882612.11044: done transferring module to remote 15794 1726882612.11052: _low_level_execute_command(): starting 15794 1726882612.11059: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/ /root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/AnsiballZ_command.py && sleep 0' 15794 1726882612.11487: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882612.11493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.11495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882612.11497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882612.11500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.11558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.11561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.11617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.13465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.13507: stderr chunk (state=3): >>><<< 15794 1726882612.13511: stdout chunk (state=3): >>><<< 15794 1726882612.13523: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.13527: _low_level_execute_command(): starting 15794 1726882612.13532: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/AnsiballZ_command.py && sleep 0' 15794 1726882612.13923: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882612.13960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882612.13963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.13966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882612.13969: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.14020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.14027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.14088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.31722: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:36:52.311630", "end": "2024-09-20 21:36:52.315315", "delta": "0:00:00.003685", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882612.33363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882612.33424: stderr chunk (state=3): >>><<< 15794 1726882612.33427: stdout chunk (state=3): >>><<< 15794 1726882612.33445: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:36:52.311630", "end": "2024-09-20 21:36:52.315315", "delta": "0:00:00.003685", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
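
As an editorial aside (not part of the captured log): the module result embedded in the record above is a single JSON document on stdout. A minimal sketch of pulling fields out of such a result, using the exact payload captured above with `module_args`/`invocation` abbreviated, and assuming the controller's `python3` is available:

```shell
#!/bin/sh
# Verbatim module result from the log record above (invocation details trimmed).
result='{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:36:52.311630", "end": "2024-09-20 21:36:52.315315", "delta": "0:00:00.003685", "msg": ""}'

# Extract the return code and the argv the command module actually ran.
rc=$(printf '%s' "$result" | python3 -c 'import json,sys; print(json.load(sys.stdin)["rc"])')
cmd=$(printf '%s' "$result" | python3 -c 'import json,sys; print(" ".join(json.load(sys.stdin)["cmd"]))')
echo "rc=$rc cmd=$cmd"
```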
15794 1726882612.33476: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882612.33488: _low_level_execute_command(): starting 15794 1726882612.33494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882612.0477178-16115-153009180792104/ > /dev/null 2>&1 && sleep 0' 15794 1726882612.33940: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882612.33976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882612.33982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.33985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.33987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 
1726882612.33989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.34040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.34047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.34106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.36142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.36146: stdout chunk (state=3): >>><<< 15794 1726882612.36148: stderr chunk (state=3): >>><<< 15794 1726882612.36342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.36345: handler run complete 15794 1726882612.36348: Evaluated conditional (False): False 15794 1726882612.36350: attempt loop complete, returning result 15794 1726882612.36352: variable 'item' from source: unknown 15794 1726882612.36354: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003685", "end": "2024-09-20 21:36:52.315315", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-20 21:36:52.311630" } 15794 1726882612.36624: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.36643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.36690: variable 'omit' from source: magic vars 15794 1726882612.36909: variable 'ansible_distribution_major_version' from source: facts 15794 1726882612.36928: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882612.37211: variable 'type' from source: set_fact 15794 1726882612.37235: variable 'state' from source: include params 15794 1726882612.37344: variable 'interface' from source: set_fact 15794 1726882612.37347: variable 'current_interfaces' from source: set_fact 15794 1726882612.37350: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 15794 1726882612.37352: variable 'omit' from source: magic vars 15794 1726882612.37354: variable 'omit' from source: magic vars 15794 1726882612.37357: variable 'item' from source: unknown 15794 1726882612.37438: variable 'item' from source: unknown 15794 1726882612.37473: variable 'omit' from source: magic vars 15794 1726882612.37506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 15794 1726882612.37520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882612.37532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882612.37581: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882612.37743: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.37746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.38070: Set connection var ansible_connection to ssh 15794 1726882612.38073: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882612.38076: Set connection var ansible_pipelining to False 15794 1726882612.38081: Set connection var ansible_shell_executable to /bin/sh 15794 1726882612.38091: Set connection var ansible_shell_type to sh 15794 1726882612.38217: Set connection var ansible_timeout to 10 15794 1726882612.38238: variable 'ansible_shell_executable' from source: unknown 15794 1726882612.38294: variable 'ansible_connection' from source: unknown 15794 1726882612.38304: variable 'ansible_module_compression' from source: unknown 15794 1726882612.38313: variable 'ansible_shell_type' from source: unknown 15794 1726882612.38333: variable 'ansible_shell_executable' from source: unknown 15794 1726882612.38404: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.38549: variable 'ansible_pipelining' from source: unknown 15794 1726882612.38556: variable 'ansible_timeout' from source: unknown 15794 1726882612.38559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.38699: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882612.38766: variable 'omit' from source: magic vars 15794 1726882612.38770: starting attempt loop 15794 1726882612.38772: running the handler 15794 1726882612.38775: _low_level_execute_command(): starting 15794 1726882612.38785: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882612.39558: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.39668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.39671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882612.39719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.39812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.41682: stdout chunk (state=3): >>>/root <<< 15794 1726882612.41765: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 15794 1726882612.41795: stdout chunk (state=3): >>><<< 15794 1726882612.41798: stderr chunk (state=3): >>><<< 15794 1726882612.41814: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.41828: _low_level_execute_command(): starting 15794 1726882612.41844: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205 `" && echo ansible-tmp-1726882612.418192-16115-125085775697205="` echo /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205 `" ) && sleep 0' 15794 1726882612.42805: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882612.42811: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882612.42827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.42837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.42841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882612.42857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882612.42865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882612.42952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882612.42956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.42958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.42983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882612.42988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.43082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.45068: stdout chunk (state=3): >>>ansible-tmp-1726882612.418192-16115-125085775697205=/root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205 <<< 15794 1726882612.45285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.45290: stdout chunk (state=3): >>><<< 15794 1726882612.45292: stderr chunk (state=3): >>><<< 
15794 1726882612.45441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882612.418192-16115-125085775697205=/root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.45444: variable 'ansible_module_compression' from source: unknown 15794 1726882612.45447: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882612.45449: variable 'ansible_facts' from source: unknown 15794 1726882612.45498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/AnsiballZ_command.py 15794 1726882612.45657: Sending initial data 15794 1726882612.45676: Sent initial data (155 bytes) 15794 1726882612.46445: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.46491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.46509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882612.46527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.46615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.48249: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882612.48316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882612.48383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp9x96u466 /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/AnsiballZ_command.py <<< 15794 1726882612.48386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/AnsiballZ_command.py" <<< 15794 1726882612.48432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp9x96u466" to remote "/root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/AnsiballZ_command.py" <<< 15794 1726882612.49797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.49801: stdout chunk (state=3): >>><<< 15794 1726882612.49803: stderr chunk (state=3): >>><<< 15794 1726882612.49805: done transferring module to remote 15794 1726882612.49814: _low_level_execute_command(): starting 15794 1726882612.49824: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/ /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/AnsiballZ_command.py && sleep 0' 15794 1726882612.50454: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882612.50457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match not found <<< 15794 1726882612.50460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.50462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882612.50464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.50526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.50530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.50607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.52528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.52531: stdout chunk (state=3): >>><<< 15794 1726882612.52536: stderr chunk (state=3): >>><<< 15794 1726882612.52552: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.52560: _low_level_execute_command(): starting 15794 1726882612.52647: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/AnsiballZ_command.py && sleep 0' 15794 1726882612.53251: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.53322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.53341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 
1726882612.53389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.53459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.71452: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:36:52.708437", "end": "2024-09-20 21:36:52.712293", "delta": "0:00:00.003856", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882612.73164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882612.73239: stderr chunk (state=3): >>><<< 15794 1726882612.73263: stdout chunk (state=3): >>><<< 15794 1726882612.73436: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:36:52.708437", "end": "2024-09-20 21:36:52.712293", "delta": "0:00:00.003856", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882612.73441: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882612.73443: _low_level_execute_command(): starting 15794 1726882612.73446: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882612.418192-16115-125085775697205/ > /dev/null 2>&1 && sleep 0' 15794 1726882612.74126: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.74169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.74188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882612.74210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.74302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.76314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.76327: stdout chunk (state=3): >>><<< 15794 1726882612.76343: stderr chunk (state=3): >>><<< 15794 1726882612.76363: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.76374: handler run complete 15794 1726882612.76410: Evaluated conditional (False): False 15794 1726882612.76430: attempt loop complete, returning result 15794 1726882612.76539: variable 'item' from source: unknown 15794 1726882612.76570: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003856", "end": "2024-09-20 21:36:52.712293", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-20 21:36:52.708437" } 15794 1726882612.77061: dumping result to json 15794 1726882612.77065: done dumping result, returning 15794 1726882612.77067: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 [0affe814-3a2d-94e5-e48f-000000000135] 15794 1726882612.77069: sending task result for task 0affe814-3a2d-94e5-e48f-000000000135 15794 1726882612.77126: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000135 15794 1726882612.77129: WORKER PROCESS EXITING 15794 1726882612.77228: no more pending results, returning what we have 15794 1726882612.77232: results queue empty 15794 1726882612.77240: checking for any_errors_fatal 15794 1726882612.77247: done checking for any_errors_fatal 15794 1726882612.77248: checking for max_fail_percentage 15794 1726882612.77250: done 
checking for max_fail_percentage 15794 1726882612.77251: checking to see if all hosts have failed and the running result is not ok 15794 1726882612.77252: done checking to see if all hosts have failed 15794 1726882612.77253: getting the remaining hosts for this loop 15794 1726882612.77256: done getting the remaining hosts for this loop 15794 1726882612.77260: getting the next task for host managed_node1 15794 1726882612.77267: done getting next task for host managed_node1 15794 1726882612.77270: ^ task is: TASK: Set up veth as managed by NetworkManager 15794 1726882612.77273: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882612.77283: getting variables 15794 1726882612.77285: in VariableManager get_vars() 15794 1726882612.77319: Calling all_inventory to load vars for managed_node1 15794 1726882612.77322: Calling groups_inventory to load vars for managed_node1 15794 1726882612.77327: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882612.77642: Calling all_plugins_play to load vars for managed_node1 15794 1726882612.77647: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882612.77652: Calling groups_plugins_play to load vars for managed_node1 15794 1726882612.78225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882612.78845: done with get_vars() 15794 1726882612.78942: done getting variables 15794 1726882612.79015: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:36:52 -0400 (0:00:01.197) 0:00:10.348 ****** 15794 1726882612.79049: entering _queue_task() for managed_node1/command 15794 1726882612.79665: worker is 1 (out of 1 available) 15794 1726882612.79682: exiting _queue_task() for managed_node1/command 15794 1726882612.79698: done queuing things up, now waiting for results queue to drain 15794 1726882612.79700: waiting for pending results... 
15794 1726882612.80074: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 15794 1726882612.80287: in run() - task 0affe814-3a2d-94e5-e48f-000000000136 15794 1726882612.80291: variable 'ansible_search_path' from source: unknown 15794 1726882612.80293: variable 'ansible_search_path' from source: unknown 15794 1726882612.80296: calling self._execute() 15794 1726882612.80376: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.80400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.80419: variable 'omit' from source: magic vars 15794 1726882612.80885: variable 'ansible_distribution_major_version' from source: facts 15794 1726882612.80907: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882612.81133: variable 'type' from source: set_fact 15794 1726882612.81157: variable 'state' from source: include params 15794 1726882612.81170: Evaluated conditional (type == 'veth' and state == 'present'): True 15794 1726882612.81239: variable 'omit' from source: magic vars 15794 1726882612.81242: variable 'omit' from source: magic vars 15794 1726882612.81470: variable 'interface' from source: set_fact 15794 1726882612.81502: variable 'omit' from source: magic vars 15794 1726882612.81554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882612.81623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882612.81655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882612.81703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882612.81715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
15794 1726882612.81763: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882612.81812: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.81816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.81932: Set connection var ansible_connection to ssh 15794 1726882612.81950: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882612.81963: Set connection var ansible_pipelining to False 15794 1726882612.81976: Set connection var ansible_shell_executable to /bin/sh 15794 1726882612.81988: Set connection var ansible_shell_type to sh 15794 1726882612.82030: Set connection var ansible_timeout to 10 15794 1726882612.82052: variable 'ansible_shell_executable' from source: unknown 15794 1726882612.82061: variable 'ansible_connection' from source: unknown 15794 1726882612.82070: variable 'ansible_module_compression' from source: unknown 15794 1726882612.82077: variable 'ansible_shell_type' from source: unknown 15794 1726882612.82138: variable 'ansible_shell_executable' from source: unknown 15794 1726882612.82141: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882612.82150: variable 'ansible_pipelining' from source: unknown 15794 1726882612.82153: variable 'ansible_timeout' from source: unknown 15794 1726882612.82155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882612.82319: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882612.82341: variable 'omit' from source: magic vars 15794 1726882612.82353: starting attempt loop 15794 1726882612.82372: running the handler 15794 1726882612.82396: _low_level_execute_command(): 
starting 15794 1726882612.82440: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882612.83252: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.83403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.83428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882612.83459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882612.83490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.83752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.85325: stdout chunk (state=3): >>>/root <<< 15794 1726882612.85502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.85519: stdout chunk (state=3): >>><<< 15794 1726882612.85770: stderr chunk (state=3): >>><<< 15794 1726882612.85775: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.85781: _low_level_execute_command(): starting 15794 1726882612.85784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329 `" && echo ansible-tmp-1726882612.8567173-16153-116296036291329="` echo /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329 `" ) && sleep 0' 15794 1726882612.86895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882612.86899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882612.86910: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.86913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882612.87339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.87343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.89264: stdout chunk (state=3): >>>ansible-tmp-1726882612.8567173-16153-116296036291329=/root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329 <<< 15794 1726882612.89377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.89570: stderr chunk (state=3): >>><<< 15794 1726882612.89573: stdout chunk (state=3): >>><<< 15794 1726882612.89592: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882612.8567173-16153-116296036291329=/root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
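The tmp-directory step above creates a remote workspace named like `ansible-tmp-1726882612.8567173-16153-116296036291329`. As a rough sketch (the exact fields Ansible combines are an assumption here; only the observed shape — epoch time, a pid-like number, a random suffix — is taken from the log), such a name can be built like this:

```python
import os
import random
import time

def make_remote_tmp_name(prefix="ansible-tmp"):
    # Shape observed in the log: ansible-tmp-<epoch>-<pid-like>-<random>.
    # The exact components Ansible uses are an assumption; only the
    # three-field format is taken from the log line above.
    return "%s-%s-%s-%s" % (prefix, time.time(), os.getpid(), random.randint(0, 2**48))
```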
10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.89840: variable 'ansible_module_compression' from source: unknown 15794 1726882612.89844: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882612.89846: variable 'ansible_facts' from source: unknown 15794 1726882612.90006: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/AnsiballZ_command.py 15794 1726882612.90427: Sending initial data 15794 1726882612.90431: Sent initial data (156 bytes) 15794 1726882612.91744: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882612.91855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.91944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.93561: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15794 1726882612.93575: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882612.93615: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882612.93821: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpx06w5z0t /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/AnsiballZ_command.py <<< 15794 1726882612.93825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/AnsiballZ_command.py" <<< 15794 1726882612.93873: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpx06w5z0t" to remote "/root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/AnsiballZ_command.py" <<< 15794 1726882612.95800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.95804: stdout chunk (state=3): >>><<< 15794 1726882612.95813: stderr chunk (state=3): >>><<< 15794 1726882612.95838: done transferring module to remote 15794 1726882612.95852: _low_level_execute_command(): starting 15794 1726882612.95858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/ /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/AnsiballZ_command.py && sleep 0' 15794 1726882612.96988: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882612.97349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
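The transfer above ships `AnsiballZ_command.py` over SFTP; the log's cache key `ansible.modules.command-ZIP_DEFLATED` hints that the module source is packed with deflate compression. A minimal sketch of that packaging idea (simplified; this is not Ansible's actual AnsiballZ wrapper, just the zip-then-base64 round trip):

```python
import base64
import io
import zipfile

def wrap_module(source: str) -> str:
    # Zip the module source with ZIP_DEFLATED (matching the cache key in
    # the log) and base64-encode it so it can travel inside a text script.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("__main__.py", source)
    return base64.b64encode(buf.getvalue()).decode("ascii")

def unwrap_module(payload: str) -> str:
    # Reverse step, as a remote bootstrap would do before executing.
    buf = io.BytesIO(base64.b64decode(payload))
    with zipfile.ZipFile(buf) as zf:
        return zf.read("__main__.py").decode("utf-8")
```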
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882612.97367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882612.97522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882612.99319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882612.99413: stderr chunk (state=3): >>><<< 15794 1726882612.99416: stdout chunk (state=3): >>><<< 15794 1726882612.99444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882612.99447: _low_level_execute_command(): starting 15794 1726882612.99454: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/AnsiballZ_command.py && sleep 0' 15794 1726882613.00139: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882613.00148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882613.00160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882613.00176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882613.00218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882613.00309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.00343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882613.00437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.19763: 
stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:36:53.174236", "end": "2024-09-20 21:36:53.195752", "delta": "0:00:00.021516", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882613.21571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882613.21597: stderr chunk (state=3): >>><<< 15794 1726882613.21607: stdout chunk (state=3): >>><<< 15794 1726882613.21638: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:36:53.174236", "end": "2024-09-20 21:36:53.195752", "delta": "0:00:00.021516", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
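The module's stdout above is a single JSON document that the controller parses into the task result. Replaying that parse on the fields captured in the log (a subset of the full payload, copied verbatim):

```python
import json

# Verbatim fields from the module stdout shown in the log above.
raw = ('{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
       '"cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], '
       '"start": "2024-09-20 21:36:53.174236", "end": "2024-09-20 21:36:53.195752", '
       '"delta": "0:00:00.021516", "msg": ""}')

result = json.loads(raw)
```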
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882613.21717: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882613.21738: _low_level_execute_command(): starting 15794 1726882613.21751: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882612.8567173-16153-116296036291329/ > /dev/null 2>&1 && sleep 0' 15794 1726882613.22438: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882613.22455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882613.22469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882613.22504: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882613.22615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.22647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882613.22745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.24856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882613.24860: stdout chunk (state=3): >>><<< 15794 1726882613.24863: stderr chunk (state=3): >>><<< 15794 1726882613.24866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882613.24868: handler run complete 15794 1726882613.24886: Evaluated conditional (False): False 15794 1726882613.24909: attempt loop complete, returning result 15794 1726882613.24918: _execute() done 15794 1726882613.24926: dumping result to json 15794 1726882613.24940: done dumping result, returning 15794 1726882613.24964: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0affe814-3a2d-94e5-e48f-000000000136] 15794 1726882613.24976: sending task result for task 0affe814-3a2d-94e5-e48f-000000000136 ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.021516", "end": "2024-09-20 21:36:53.195752", "rc": 0, "start": "2024-09-20 21:36:53.174236" } 15794 1726882613.25292: no more pending results, returning what we have 15794 1726882613.25296: results queue empty 15794 1726882613.25300: checking for any_errors_fatal 15794 1726882613.25318: done checking for any_errors_fatal 15794 1726882613.25319: checking for max_fail_percentage 15794 1726882613.25321: done checking for max_fail_percentage 15794 1726882613.25322: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.25324: done checking to see if all hosts have failed 15794 1726882613.25325: getting the remaining hosts for this loop 15794 1726882613.25327: done getting the remaining hosts for this loop 15794 
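The `delta` field in the result above is simply `end - start` on the two timestamps the module records. Checking that arithmetic against the logged values:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f"
# start/end copied from the task result above.
start = datetime.strptime("2024-09-20 21:36:53.174236", FMT)
end = datetime.strptime("2024-09-20 21:36:53.195752", FMT)
delta = end - start  # timedelta; str() gives the "0:00:00.021516" form
```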
1726882613.25333: getting the next task for host managed_node1 15794 1726882613.25343: done getting next task for host managed_node1 15794 1726882613.25347: ^ task is: TASK: Delete veth interface {{ interface }} 15794 1726882613.25351: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882613.25355: getting variables 15794 1726882613.25358: in VariableManager get_vars() 15794 1726882613.25398: Calling all_inventory to load vars for managed_node1 15794 1726882613.25402: Calling groups_inventory to load vars for managed_node1 15794 1726882613.25407: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.25421: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.25425: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.25429: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.25727: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000136 15794 1726882613.25731: WORKER PROCESS EXITING 15794 1726882613.25984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.26283: done with get_vars() 15794 1726882613.26297: done getting variables 15794 1726882613.26375: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882613.26528: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:36:53 -0400 (0:00:00.475) 0:00:10.823 ****** 15794 1726882613.26566: entering _queue_task() for managed_node1/command 15794 1726882613.26982: worker is 1 (out of 1 available) 15794 1726882613.26995: exiting _queue_task() for managed_node1/command 15794 1726882613.27006: done queuing things up, now waiting for results queue to drain 15794 1726882613.27007: waiting for pending results... 15794 1726882613.27194: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 15794 1726882613.27325: in run() - task 0affe814-3a2d-94e5-e48f-000000000137 15794 1726882613.27354: variable 'ansible_search_path' from source: unknown 15794 1726882613.27361: variable 'ansible_search_path' from source: unknown 15794 1726882613.27409: calling self._execute() 15794 1726882613.27509: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.27522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.27538: variable 'omit' from source: magic vars 15794 1726882613.27977: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.28004: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.28660: variable 'type' from source: set_fact 15794 1726882613.28672: variable 'state' from source: include params 15794 1726882613.28686: variable 'interface' from source: set_fact 15794 1726882613.28696: variable 'current_interfaces' from source: set_fact 15794 1726882613.28717: 
Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 15794 1726882613.28726: when evaluation is False, skipping this task 15794 1726882613.28736: _execute() done 15794 1726882613.28745: dumping result to json 15794 1726882613.28761: done dumping result, returning 15794 1726882613.28815: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 [0affe814-3a2d-94e5-e48f-000000000137] 15794 1726882613.28818: sending task result for task 0affe814-3a2d-94e5-e48f-000000000137 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 15794 1726882613.29054: no more pending results, returning what we have 15794 1726882613.29059: results queue empty 15794 1726882613.29061: checking for any_errors_fatal 15794 1726882613.29069: done checking for any_errors_fatal 15794 1726882613.29071: checking for max_fail_percentage 15794 1726882613.29073: done checking for max_fail_percentage 15794 1726882613.29074: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.29075: done checking to see if all hosts have failed 15794 1726882613.29076: getting the remaining hosts for this loop 15794 1726882613.29082: done getting the remaining hosts for this loop 15794 1726882613.29086: getting the next task for host managed_node1 15794 1726882613.29095: done getting next task for host managed_node1 15794 1726882613.29098: ^ task is: TASK: Create dummy interface {{ interface }} 15794 1726882613.29101: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
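The skip above comes from the task's `when` expression evaluating to False. A minimal stand-in for that check (the actual evaluation goes through Jinja2; the variable values `type='veth'`, `state='present'` are assumptions inferred from which conditionals in this log evaluate False):

```python
def should_delete_veth(task_type, state, interface, current_interfaces):
    # Mirrors the logged condition:
    # type == 'veth' and state == 'absent' and interface in current_interfaces
    return (task_type == "veth"
            and state == "absent"
            and interface in current_interfaces)
```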
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882613.29105: getting variables 15794 1726882613.29107: in VariableManager get_vars() 15794 1726882613.29145: Calling all_inventory to load vars for managed_node1 15794 1726882613.29149: Calling groups_inventory to load vars for managed_node1 15794 1726882613.29154: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.29169: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.29173: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.29181: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.29701: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000137 15794 1726882613.29705: WORKER PROCESS EXITING 15794 1726882613.29799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.30081: done with get_vars() 15794 1726882613.30093: done getting variables 15794 1726882613.30172: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882613.30295: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:36:53 -0400 
(0:00:00.037) 0:00:10.861 ****** 15794 1726882613.30325: entering _queue_task() for managed_node1/command 15794 1726882613.30671: worker is 1 (out of 1 available) 15794 1726882613.30687: exiting _queue_task() for managed_node1/command 15794 1726882613.30700: done queuing things up, now waiting for results queue to drain 15794 1726882613.30701: waiting for pending results... 15794 1726882613.31014: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 15794 1726882613.31066: in run() - task 0affe814-3a2d-94e5-e48f-000000000138 15794 1726882613.31110: variable 'ansible_search_path' from source: unknown 15794 1726882613.31114: variable 'ansible_search_path' from source: unknown 15794 1726882613.31153: calling self._execute() 15794 1726882613.31328: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.31332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.31338: variable 'omit' from source: magic vars 15794 1726882613.31742: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.31769: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.32073: variable 'type' from source: set_fact 15794 1726882613.32095: variable 'state' from source: include params 15794 1726882613.32110: variable 'interface' from source: set_fact 15794 1726882613.32122: variable 'current_interfaces' from source: set_fact 15794 1726882613.32140: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 15794 1726882613.32150: when evaluation is False, skipping this task 15794 1726882613.32159: _execute() done 15794 1726882613.32166: dumping result to json 15794 1726882613.32198: done dumping result, returning 15794 1726882613.32202: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 [0affe814-3a2d-94e5-e48f-000000000138] 15794 1726882613.32205: sending task 
result for task 0affe814-3a2d-94e5-e48f-000000000138 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 15794 1726882613.32484: no more pending results, returning what we have 15794 1726882613.32489: results queue empty 15794 1726882613.32491: checking for any_errors_fatal 15794 1726882613.32501: done checking for any_errors_fatal 15794 1726882613.32503: checking for max_fail_percentage 15794 1726882613.32505: done checking for max_fail_percentage 15794 1726882613.32506: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.32507: done checking to see if all hosts have failed 15794 1726882613.32508: getting the remaining hosts for this loop 15794 1726882613.32511: done getting the remaining hosts for this loop 15794 1726882613.32515: getting the next task for host managed_node1 15794 1726882613.32524: done getting next task for host managed_node1 15794 1726882613.32527: ^ task is: TASK: Delete dummy interface {{ interface }} 15794 1726882613.32532: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.32538: getting variables 15794 1726882613.32540: in VariableManager get_vars() 15794 1726882613.32576: Calling all_inventory to load vars for managed_node1 15794 1726882613.32580: Calling groups_inventory to load vars for managed_node1 15794 1726882613.32585: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.32603: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.32607: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.32611: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.32981: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000138 15794 1726882613.32985: WORKER PROCESS EXITING 15794 1726882613.33015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.33317: done with get_vars() 15794 1726882613.33329: done getting variables 15794 1726882613.33396: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882613.33529: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:36:53 -0400 (0:00:00.032) 0:00:10.893 ****** 15794 1726882613.33565: entering _queue_task() for managed_node1/command 15794 1726882613.33887: worker is 1 (out of 1 available) 15794 1726882613.33900: exiting _queue_task() for managed_node1/command 15794 1726882613.33912: done queuing things up, now waiting for results queue to drain 15794 1726882613.33914: waiting for pending results... 
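The task names above (`Delete veth interface {{ interface }}` becoming `Delete veth interface lsr27`, and likewise for the dummy tasks) are templated from the `interface` fact. A toy stand-in for that rendering step (Ansible actually uses Jinja2, not string replacement; this only illustrates the substitution):

```python
def render_task_name(template, variables):
    # Naive "{{ var }}" substitution; a sketch, not Jinja2.
    out = template
    for key, value in variables.items():
        out = out.replace("{{ %s }}" % key, str(value))
    return out
```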
15794 1726882613.34143: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27 15794 1726882613.34269: in run() - task 0affe814-3a2d-94e5-e48f-000000000139 15794 1726882613.34298: variable 'ansible_search_path' from source: unknown 15794 1726882613.34306: variable 'ansible_search_path' from source: unknown 15794 1726882613.34353: calling self._execute() 15794 1726882613.34453: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.34467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.34484: variable 'omit' from source: magic vars 15794 1726882613.34906: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.34925: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.35281: variable 'type' from source: set_fact 15794 1726882613.35294: variable 'state' from source: include params 15794 1726882613.35304: variable 'interface' from source: set_fact 15794 1726882613.35312: variable 'current_interfaces' from source: set_fact 15794 1726882613.35325: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 15794 1726882613.35332: when evaluation is False, skipping this task 15794 1726882613.35342: _execute() done 15794 1726882613.35350: dumping result to json 15794 1726882613.35357: done dumping result, returning 15794 1726882613.35366: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27 [0affe814-3a2d-94e5-e48f-000000000139] 15794 1726882613.35386: sending task result for task 0affe814-3a2d-94e5-e48f-000000000139 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 15794 1726882613.35527: no more pending results, returning what we have 15794 1726882613.35532: results queue empty 15794 1726882613.35533: 
checking for any_errors_fatal 15794 1726882613.35541: done checking for any_errors_fatal 15794 1726882613.35542: checking for max_fail_percentage 15794 1726882613.35544: done checking for max_fail_percentage 15794 1726882613.35545: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.35546: done checking to see if all hosts have failed 15794 1726882613.35546: getting the remaining hosts for this loop 15794 1726882613.35548: done getting the remaining hosts for this loop 15794 1726882613.35553: getting the next task for host managed_node1 15794 1726882613.35560: done getting next task for host managed_node1 15794 1726882613.35563: ^ task is: TASK: Create tap interface {{ interface }} 15794 1726882613.35566: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.35570: getting variables 15794 1726882613.35572: in VariableManager get_vars() 15794 1726882613.35603: Calling all_inventory to load vars for managed_node1 15794 1726882613.35607: Calling groups_inventory to load vars for managed_node1 15794 1726882613.35611: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.35625: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.35628: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.35632: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.36093: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000139 15794 1726882613.36097: WORKER PROCESS EXITING 15794 1726882613.36124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.36402: done with get_vars() 15794 1726882613.36412: done getting variables 15794 1726882613.36470: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882613.36594: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:36:53 -0400 (0:00:00.030) 0:00:10.924 ****** 15794 1726882613.36625: entering _queue_task() for managed_node1/command 15794 1726882613.36956: worker is 1 (out of 1 available) 15794 1726882613.36968: exiting _queue_task() for managed_node1/command 15794 1726882613.36978: done queuing things up, now waiting for results queue to drain 15794 1726882613.36980: waiting for pending results... 
15794 1726882613.37166: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27 15794 1726882613.37302: in run() - task 0affe814-3a2d-94e5-e48f-00000000013a 15794 1726882613.37328: variable 'ansible_search_path' from source: unknown 15794 1726882613.37339: variable 'ansible_search_path' from source: unknown 15794 1726882613.37389: calling self._execute() 15794 1726882613.37498: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.37512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.37533: variable 'omit' from source: magic vars 15794 1726882613.37999: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.38025: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.38304: variable 'type' from source: set_fact 15794 1726882613.38316: variable 'state' from source: include params 15794 1726882613.38326: variable 'interface' from source: set_fact 15794 1726882613.38337: variable 'current_interfaces' from source: set_fact 15794 1726882613.38354: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 15794 1726882613.38363: when evaluation is False, skipping this task 15794 1726882613.38370: _execute() done 15794 1726882613.38377: dumping result to json 15794 1726882613.38386: done dumping result, returning 15794 1726882613.38396: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27 [0affe814-3a2d-94e5-e48f-00000000013a] 15794 1726882613.38412: sending task result for task 0affe814-3a2d-94e5-e48f-00000000013a skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 15794 1726882613.38682: no more pending results, returning what we have 15794 1726882613.38686: results queue empty 15794 
1726882613.38688: checking for any_errors_fatal 15794 1726882613.38693: done checking for any_errors_fatal 15794 1726882613.38694: checking for max_fail_percentage 15794 1726882613.38695: done checking for max_fail_percentage 15794 1726882613.38696: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.38697: done checking to see if all hosts have failed 15794 1726882613.38698: getting the remaining hosts for this loop 15794 1726882613.38700: done getting the remaining hosts for this loop 15794 1726882613.38704: getting the next task for host managed_node1 15794 1726882613.38709: done getting next task for host managed_node1 15794 1726882613.38712: ^ task is: TASK: Delete tap interface {{ interface }} 15794 1726882613.38715: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.38718: getting variables 15794 1726882613.38720: in VariableManager get_vars() 15794 1726882613.38752: Calling all_inventory to load vars for managed_node1 15794 1726882613.38755: Calling groups_inventory to load vars for managed_node1 15794 1726882613.38760: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.38835: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.38841: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.38849: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000013a 15794 1726882613.38852: WORKER PROCESS EXITING 15794 1726882613.38857: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.39117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.39367: done with get_vars() 15794 1726882613.39378: done getting variables 15794 1726882613.39448: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882613.39565: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:36:53 -0400 (0:00:00.029) 0:00:10.953 ****** 15794 1726882613.39594: entering _queue_task() for managed_node1/command 15794 1726882613.39840: worker is 1 (out of 1 available) 15794 1726882613.39854: exiting _queue_task() for managed_node1/command 15794 1726882613.39981: done queuing things up, now waiting for results queue to drain 15794 1726882613.39983: waiting for pending results... 
15794 1726882613.40148: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27 15794 1726882613.40273: in run() - task 0affe814-3a2d-94e5-e48f-00000000013b 15794 1726882613.40295: variable 'ansible_search_path' from source: unknown 15794 1726882613.40309: variable 'ansible_search_path' from source: unknown 15794 1726882613.40359: calling self._execute() 15794 1726882613.40465: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.40480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.40500: variable 'omit' from source: magic vars 15794 1726882613.41043: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.41062: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.41372: variable 'type' from source: set_fact 15794 1726882613.41385: variable 'state' from source: include params 15794 1726882613.41405: variable 'interface' from source: set_fact 15794 1726882613.41420: variable 'current_interfaces' from source: set_fact 15794 1726882613.41508: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 15794 1726882613.41512: when evaluation is False, skipping this task 15794 1726882613.41516: _execute() done 15794 1726882613.41519: dumping result to json 15794 1726882613.41521: done dumping result, returning 15794 1726882613.41524: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27 [0affe814-3a2d-94e5-e48f-00000000013b] 15794 1726882613.41526: sending task result for task 0affe814-3a2d-94e5-e48f-00000000013b 15794 1726882613.41599: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000013b 15794 1726882613.41602: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 15794 
1726882613.41661: no more pending results, returning what we have 15794 1726882613.41666: results queue empty 15794 1726882613.41667: checking for any_errors_fatal 15794 1726882613.41674: done checking for any_errors_fatal 15794 1726882613.41675: checking for max_fail_percentage 15794 1726882613.41677: done checking for max_fail_percentage 15794 1726882613.41678: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.41679: done checking to see if all hosts have failed 15794 1726882613.41680: getting the remaining hosts for this loop 15794 1726882613.41682: done getting the remaining hosts for this loop 15794 1726882613.41687: getting the next task for host managed_node1 15794 1726882613.41698: done getting next task for host managed_node1 15794 1726882613.41702: ^ task is: TASK: Include the task 'assert_device_present.yml' 15794 1726882613.41705: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.41709: getting variables 15794 1726882613.41712: in VariableManager get_vars() 15794 1726882613.41752: Calling all_inventory to load vars for managed_node1 15794 1726882613.41756: Calling groups_inventory to load vars for managed_node1 15794 1726882613.41762: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.41778: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.41782: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.41786: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.42349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.43063: done with get_vars() 15794 1726882613.43075: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Friday 20 September 2024 21:36:53 -0400 (0:00:00.036) 0:00:10.990 ****** 15794 1726882613.43290: entering _queue_task() for managed_node1/include_tasks 15794 1726882613.43843: worker is 1 (out of 1 available) 15794 1726882613.43855: exiting _queue_task() for managed_node1/include_tasks 15794 1726882613.43869: done queuing things up, now waiting for results queue to drain 15794 1726882613.43870: waiting for pending results... 
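Annotation: the four consecutive skips above (create/delete dummy, create/delete tap) all come from the create/delete pairs in manage_test_interface.yml. A minimal sketch of that pattern, reconstructed from the `false_condition` strings echoed in the log — the `when:` clauses are verbatim from those strings, but the `ip` commands are illustrative assumptions, not copied from the real file:

```yaml
# Sketch of the conditional create/delete pattern behind the skipped tasks.
# when: clauses are taken verbatim from the false_condition output above;
# the command lines are assumed for illustration.
- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy   # assumed command
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  command: ip link del {{ interface }}              # assumed command
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces

- name: Create tap interface {{ interface }}
  command: ip tuntap add dev {{ interface }} mode tap   # assumed command
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip tuntap del dev {{ interface }} mode tap   # assumed command
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```

With `interface=lsr27` and a `type`/`state` combination that matches none of the guards, every `when:` evaluates to False, which is exactly why each task reports `"skip_reason": "Conditional result was False"` above.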
15794 1726882613.44456: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 15794 1726882613.44640: in run() - task 0affe814-3a2d-94e5-e48f-000000000012 15794 1726882613.44840: variable 'ansible_search_path' from source: unknown 15794 1726882613.44844: calling self._execute() 15794 1726882613.44847: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.44850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.44852: variable 'omit' from source: magic vars 15794 1726882613.45876: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.45901: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.45916: _execute() done 15794 1726882613.45927: dumping result to json 15794 1726882613.45942: done dumping result, returning 15794 1726882613.45961: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0affe814-3a2d-94e5-e48f-000000000012] 15794 1726882613.45974: sending task result for task 0affe814-3a2d-94e5-e48f-000000000012 15794 1726882613.46200: no more pending results, returning what we have 15794 1726882613.46207: in VariableManager get_vars() 15794 1726882613.46245: Calling all_inventory to load vars for managed_node1 15794 1726882613.46249: Calling groups_inventory to load vars for managed_node1 15794 1726882613.46254: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.46269: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.46273: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.46277: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.46597: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000012 15794 1726882613.46601: WORKER PROCESS EXITING 15794 1726882613.47005: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.47507: done with get_vars() 15794 1726882613.47515: variable 'ansible_search_path' from source: unknown 15794 1726882613.47528: we have included files to process 15794 1726882613.47530: generating all_blocks data 15794 1726882613.47531: done generating all_blocks data 15794 1726882613.47740: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15794 1726882613.47743: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15794 1726882613.47746: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15794 1726882613.47930: in VariableManager get_vars() 15794 1726882613.47951: done with get_vars() 15794 1726882613.48312: done processing included file 15794 1726882613.48314: iterating over new_blocks loaded from include file 15794 1726882613.48316: in VariableManager get_vars() 15794 1726882613.48329: done with get_vars() 15794 1726882613.48331: filtering new block on tags 15794 1726882613.48353: done filtering new block on tags 15794 1726882613.48356: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 15794 1726882613.48362: extending task lists for all hosts with included blocks 15794 1726882613.49607: done extending task lists 15794 1726882613.49609: done processing included files 15794 1726882613.49610: results queue empty 15794 1726882613.49611: checking for any_errors_fatal 15794 1726882613.49615: done checking for any_errors_fatal 15794 1726882613.49616: checking for max_fail_percentage 15794 1726882613.49617: done 
checking for max_fail_percentage 15794 1726882613.49618: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.49619: done checking to see if all hosts have failed 15794 1726882613.49620: getting the remaining hosts for this loop 15794 1726882613.49622: done getting the remaining hosts for this loop 15794 1726882613.49625: getting the next task for host managed_node1 15794 1726882613.49630: done getting next task for host managed_node1 15794 1726882613.49632: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15794 1726882613.49638: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.49641: getting variables 15794 1726882613.49642: in VariableManager get_vars() 15794 1726882613.49652: Calling all_inventory to load vars for managed_node1 15794 1726882613.49655: Calling groups_inventory to load vars for managed_node1 15794 1726882613.49658: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.49664: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.49667: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.49671: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.49892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.50150: done with get_vars() 15794 1726882613.50162: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:36:53 -0400 (0:00:00.069) 0:00:11.060 ****** 15794 1726882613.50250: entering _queue_task() for managed_node1/include_tasks 15794 1726882613.50680: worker is 1 (out of 1 available) 15794 1726882613.50691: exiting _queue_task() for managed_node1/include_tasks 15794 1726882613.50703: done queuing things up, now waiting for results queue to drain 15794 1726882613.50704: waiting for pending results... 
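Annotation: the task paths in the log show a two-level include chain — tests_ethernet.yml:30 includes assert_device_present.yml, whose line 3 in turn includes get_interface_stat.yml. The shape (task names and file names from the log; any other content of those files is not shown and is omitted):

```yaml
# tests_ethernet.yml (line 30, per the task path in the log)
- name: Include the task 'assert_device_present.yml'
  include_tasks: tasks/assert_device_present.yml

# assert_device_present.yml (line 3, per the task path in the log)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: tasks/get_interface_stat.yml
```

This matches the log's "extending task lists for all hosts with included blocks": `include_tasks` splices the loaded blocks into the host's task list at run time, which is why new task IDs (…-0000000001d3, then …-00000000021e) appear only after each include executes.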
15794 1726882613.50886: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15794 1726882613.51006: in run() - task 0affe814-3a2d-94e5-e48f-0000000001d3 15794 1726882613.51026: variable 'ansible_search_path' from source: unknown 15794 1726882613.51037: variable 'ansible_search_path' from source: unknown 15794 1726882613.51084: calling self._execute() 15794 1726882613.51340: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.51344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.51347: variable 'omit' from source: magic vars 15794 1726882613.51643: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.51661: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.51675: _execute() done 15794 1726882613.51687: dumping result to json 15794 1726882613.51697: done dumping result, returning 15794 1726882613.51739: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-94e5-e48f-0000000001d3] 15794 1726882613.51742: sending task result for task 0affe814-3a2d-94e5-e48f-0000000001d3 15794 1726882613.51974: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000001d3 15794 1726882613.51978: WORKER PROCESS EXITING 15794 1726882613.52004: no more pending results, returning what we have 15794 1726882613.52009: in VariableManager get_vars() 15794 1726882613.52045: Calling all_inventory to load vars for managed_node1 15794 1726882613.52049: Calling groups_inventory to load vars for managed_node1 15794 1726882613.52053: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.52065: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.52069: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.52073: Calling groups_plugins_play to load vars for managed_node1 15794 
1726882613.52351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.52610: done with get_vars() 15794 1726882613.52618: variable 'ansible_search_path' from source: unknown 15794 1726882613.52620: variable 'ansible_search_path' from source: unknown 15794 1726882613.52660: we have included files to process 15794 1726882613.52661: generating all_blocks data 15794 1726882613.52663: done generating all_blocks data 15794 1726882613.52664: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15794 1726882613.52666: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15794 1726882613.52668: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15794 1726882613.52927: done processing included file 15794 1726882613.52929: iterating over new_blocks loaded from include file 15794 1726882613.52931: in VariableManager get_vars() 15794 1726882613.52947: done with get_vars() 15794 1726882613.52949: filtering new block on tags 15794 1726882613.52966: done filtering new block on tags 15794 1726882613.52969: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15794 1726882613.52974: extending task lists for all hosts with included blocks 15794 1726882613.53103: done extending task lists 15794 1726882613.53105: done processing included files 15794 1726882613.53106: results queue empty 15794 1726882613.53107: checking for any_errors_fatal 15794 1726882613.53110: done checking for any_errors_fatal 15794 1726882613.53112: checking for max_fail_percentage 15794 1726882613.53113: done checking for 
max_fail_percentage 15794 1726882613.53114: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.53115: done checking to see if all hosts have failed 15794 1726882613.53116: getting the remaining hosts for this loop 15794 1726882613.53117: done getting the remaining hosts for this loop 15794 1726882613.53120: getting the next task for host managed_node1 15794 1726882613.53125: done getting next task for host managed_node1 15794 1726882613.53128: ^ task is: TASK: Get stat for interface {{ interface }} 15794 1726882613.53131: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.53135: getting variables 15794 1726882613.53137: in VariableManager get_vars() 15794 1726882613.53146: Calling all_inventory to load vars for managed_node1 15794 1726882613.53149: Calling groups_inventory to load vars for managed_node1 15794 1726882613.53152: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.53157: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.53160: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.53163: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.53382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.53654: done with get_vars() 15794 1726882613.53665: done getting variables 15794 1726882613.53841: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:53 -0400 (0:00:00.036) 0:00:11.096 ****** 15794 1726882613.53873: entering _queue_task() for managed_node1/stat 15794 1726882613.54150: worker is 1 (out of 1 available) 15794 1726882613.54163: exiting _queue_task() for managed_node1/stat 15794 1726882613.54175: done queuing things up, now waiting for results queue to drain 15794 1726882613.54177: waiting for pending results... 
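Annotation: the "Get stat for interface lsr27" task is queued for the `stat` action plugin (`entering _queue_task() for managed_node1/stat`). The log does not show the file's contents; a plausible sketch, in which the `/sys/class/net` path and the register name are assumptions:

```yaml
# get_interface_stat.yml, line 3 -- hypothetical reconstruction.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}   # assumed path
  register: interface_stat                 # assumed register name
```

The `_low_level_execute_command()` calls that follow (`echo ~`, then the `mkdir` of an `ansible-tmp-*` directory over the multiplexed SSH connection) are the standard remote-setup steps Ansible performs before copying and executing the module payload for a task like this.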
15794 1726882613.54448: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 15794 1726882613.54580: in run() - task 0affe814-3a2d-94e5-e48f-00000000021e 15794 1726882613.54604: variable 'ansible_search_path' from source: unknown 15794 1726882613.54614: variable 'ansible_search_path' from source: unknown 15794 1726882613.54669: calling self._execute() 15794 1726882613.54840: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.54844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.54846: variable 'omit' from source: magic vars 15794 1726882613.55209: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.55226: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.55240: variable 'omit' from source: magic vars 15794 1726882613.55304: variable 'omit' from source: magic vars 15794 1726882613.55426: variable 'interface' from source: set_fact 15794 1726882613.55453: variable 'omit' from source: magic vars 15794 1726882613.55501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882613.55550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882613.55579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882613.55607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882613.55624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882613.55839: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882613.55842: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.55845: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.55847: Set connection var ansible_connection to ssh 15794 1726882613.55849: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882613.55851: Set connection var ansible_pipelining to False 15794 1726882613.55853: Set connection var ansible_shell_executable to /bin/sh 15794 1726882613.55856: Set connection var ansible_shell_type to sh 15794 1726882613.55858: Set connection var ansible_timeout to 10 15794 1726882613.55889: variable 'ansible_shell_executable' from source: unknown 15794 1726882613.55898: variable 'ansible_connection' from source: unknown 15794 1726882613.55905: variable 'ansible_module_compression' from source: unknown 15794 1726882613.55912: variable 'ansible_shell_type' from source: unknown 15794 1726882613.55920: variable 'ansible_shell_executable' from source: unknown 15794 1726882613.55927: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.55937: variable 'ansible_pipelining' from source: unknown 15794 1726882613.55945: variable 'ansible_timeout' from source: unknown 15794 1726882613.55954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.56195: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882613.56212: variable 'omit' from source: magic vars 15794 1726882613.56223: starting attempt loop 15794 1726882613.56231: running the handler 15794 1726882613.56251: _low_level_execute_command(): starting 15794 1726882613.56267: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882613.57017: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882613.57031: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 15794 1726882613.57160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882613.57182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.57210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882613.57309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.59076: stdout chunk (state=3): >>>/root <<< 15794 1726882613.59293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882613.59296: stdout chunk (state=3): >>><<< 15794 1726882613.59299: stderr chunk (state=3): >>><<< 15794 1726882613.59439: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882613.59444: _low_level_execute_command(): starting 15794 1726882613.59447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439 `" && echo ansible-tmp-1726882613.5932832-16188-115197257139439="` echo /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439 `" ) && sleep 0' 15794 1726882613.60051: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882613.60080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882613.60098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.60122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882613.60221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.62283: stdout chunk (state=3): >>>ansible-tmp-1726882613.5932832-16188-115197257139439=/root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439 <<< 15794 1726882613.62468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882613.62484: stdout chunk (state=3): >>><<< 15794 1726882613.62497: stderr chunk (state=3): >>><<< 15794 1726882613.62517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882613.5932832-16188-115197257139439=/root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882613.62639: variable 'ansible_module_compression' from source: unknown 15794 1726882613.62642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15794 1726882613.62684: variable 'ansible_facts' from source: unknown 15794 1726882613.62790: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/AnsiballZ_stat.py 15794 1726882613.63001: Sending initial data 15794 1726882613.63005: Sent initial data (153 bytes) 15794 1726882613.63609: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882613.63640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882613.63662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882613.63775: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.63816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882613.63866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.65551: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882613.65604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882613.65693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp9nq79sbg /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/AnsiballZ_stat.py <<< 15794 1726882613.65699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/AnsiballZ_stat.py" <<< 15794 1726882613.65722: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp9nq79sbg" to remote "/root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/AnsiballZ_stat.py" <<< 15794 1726882613.67048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882613.67064: stderr chunk (state=3): >>><<< 15794 1726882613.67074: stdout chunk (state=3): >>><<< 15794 1726882613.67108: done transferring module to remote 15794 1726882613.67143: _low_level_execute_command(): starting 15794 1726882613.67146: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/ /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/AnsiballZ_stat.py && sleep 0' 15794 1726882613.67922: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882613.67941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882613.68021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882613.68051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.68077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882613.68173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.70167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882613.70170: stdout chunk (state=3): >>><<< 15794 1726882613.70173: stderr chunk (state=3): >>><<< 15794 1726882613.70241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882613.70248: _low_level_execute_command(): starting 15794 1726882613.70251: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/AnsiballZ_stat.py && sleep 0' 15794 1726882613.70900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882613.70917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882613.70936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882613.70956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882613.71010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882613.71085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882613.71116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.71136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15794 1726882613.71470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.88824: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36762, "dev": 23, "nlink": 1, "atime": 1726882611.9161708, "mtime": 1726882611.9161708, "ctime": 1726882611.9161708, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15794 1726882613.90353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882613.90364: stdout chunk (state=3): >>><<< 15794 1726882613.90380: stderr chunk (state=3): >>><<< 15794 1726882613.90405: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36762, "dev": 23, "nlink": 1, "atime": 1726882611.9161708, "mtime": 1726882611.9161708, "ctime": 1726882611.9161708, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882613.90540: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882613.90544: _low_level_execute_command(): starting 15794 1726882613.90546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882613.5932832-16188-115197257139439/ > /dev/null 2>&1 && sleep 0' 15794 1726882613.91116: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882613.91131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882613.91257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882613.91269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882613.91286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882613.91372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882613.93437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882613.93452: stdout chunk (state=3): >>><<< 15794 1726882613.93466: stderr chunk (state=3): >>><<< 15794 1726882613.93843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882613.93852: handler run complete 15794 1726882613.93855: attempt loop complete, returning result 15794 1726882613.93858: _execute() done 15794 1726882613.93860: dumping result to json 15794 1726882613.93862: done dumping result, returning 15794 1726882613.93864: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [0affe814-3a2d-94e5-e48f-00000000021e] 15794 1726882613.93867: sending task result for task 0affe814-3a2d-94e5-e48f-00000000021e ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882611.9161708, "block_size": 4096, "blocks": 0, "ctime": 1726882611.9161708, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 36762, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1726882611.9161708, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15794 1726882613.94084: no more pending results, returning what we have 15794 1726882613.94090: results queue empty 15794 1726882613.94092: checking for any_errors_fatal 15794 1726882613.94094: done checking for any_errors_fatal 15794 1726882613.94095: checking for max_fail_percentage 15794 1726882613.94097: done checking for max_fail_percentage 15794 1726882613.94098: checking to see if all hosts have failed and the running result is not ok 15794 
1726882613.94099: done checking to see if all hosts have failed 15794 1726882613.94100: getting the remaining hosts for this loop 15794 1726882613.94102: done getting the remaining hosts for this loop 15794 1726882613.94107: getting the next task for host managed_node1 15794 1726882613.94117: done getting next task for host managed_node1 15794 1726882613.94121: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15794 1726882613.94125: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.94241: getting variables 15794 1726882613.94244: in VariableManager get_vars() 15794 1726882613.94279: Calling all_inventory to load vars for managed_node1 15794 1726882613.94283: Calling groups_inventory to load vars for managed_node1 15794 1726882613.94287: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.94449: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.94454: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.94462: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.94666: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000021e 15794 1726882613.94670: WORKER PROCESS EXITING 15794 1726882613.94696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.95006: done with get_vars() 15794 1726882613.95017: done getting variables 15794 1726882613.95126: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15794 1726882613.95265: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:36:53 -0400 (0:00:00.414) 0:00:11.510 ****** 15794 1726882613.95300: entering _queue_task() for managed_node1/assert 15794 1726882613.95302: Creating lock for assert 15794 1726882613.95763: worker is 1 (out of 1 available) 15794 1726882613.95774: exiting _queue_task() for managed_node1/assert 15794 1726882613.95784: done queuing things up, now waiting for results queue to drain 15794 
1726882613.95786: waiting for pending results... 15794 1726882613.95882: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' 15794 1726882613.96012: in run() - task 0affe814-3a2d-94e5-e48f-0000000001d4 15794 1726882613.96032: variable 'ansible_search_path' from source: unknown 15794 1726882613.96047: variable 'ansible_search_path' from source: unknown 15794 1726882613.96121: calling self._execute() 15794 1726882613.96204: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.96228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.96338: variable 'omit' from source: magic vars 15794 1726882613.96695: variable 'ansible_distribution_major_version' from source: facts 15794 1726882613.96713: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882613.96724: variable 'omit' from source: magic vars 15794 1726882613.96776: variable 'omit' from source: magic vars 15794 1726882613.96903: variable 'interface' from source: set_fact 15794 1726882613.96926: variable 'omit' from source: magic vars 15794 1726882613.96977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882613.97037: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882613.97066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882613.97100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882613.97117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882613.97160: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882613.97172: variable 'ansible_host' from source: host vars for 
'managed_node1' 15794 1726882613.97181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.97312: Set connection var ansible_connection to ssh 15794 1726882613.97328: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882613.97345: Set connection var ansible_pipelining to False 15794 1726882613.97420: Set connection var ansible_shell_executable to /bin/sh 15794 1726882613.97424: Set connection var ansible_shell_type to sh 15794 1726882613.97426: Set connection var ansible_timeout to 10 15794 1726882613.97428: variable 'ansible_shell_executable' from source: unknown 15794 1726882613.97431: variable 'ansible_connection' from source: unknown 15794 1726882613.97433: variable 'ansible_module_compression' from source: unknown 15794 1726882613.97441: variable 'ansible_shell_type' from source: unknown 15794 1726882613.97450: variable 'ansible_shell_executable' from source: unknown 15794 1726882613.97458: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882613.97467: variable 'ansible_pipelining' from source: unknown 15794 1726882613.97475: variable 'ansible_timeout' from source: unknown 15794 1726882613.97484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882613.97676: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882613.97695: variable 'omit' from source: magic vars 15794 1726882613.97706: starting attempt loop 15794 1726882613.97713: running the handler 15794 1726882613.97969: variable 'interface_stat' from source: set_fact 15794 1726882613.97972: Evaluated conditional (interface_stat.stat.exists): True 15794 1726882613.97974: handler run complete 15794 1726882613.97976: 
attempt loop complete, returning result 15794 1726882613.97978: _execute() done 15794 1726882613.97980: dumping result to json 15794 1726882613.97982: done dumping result, returning 15794 1726882613.97984: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' [0affe814-3a2d-94e5-e48f-0000000001d4] 15794 1726882613.97987: sending task result for task 0affe814-3a2d-94e5-e48f-0000000001d4 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15794 1726882613.98125: no more pending results, returning what we have 15794 1726882613.98129: results queue empty 15794 1726882613.98130: checking for any_errors_fatal 15794 1726882613.98141: done checking for any_errors_fatal 15794 1726882613.98142: checking for max_fail_percentage 15794 1726882613.98144: done checking for max_fail_percentage 15794 1726882613.98145: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.98146: done checking to see if all hosts have failed 15794 1726882613.98147: getting the remaining hosts for this loop 15794 1726882613.98149: done getting the remaining hosts for this loop 15794 1726882613.98154: getting the next task for host managed_node1 15794 1726882613.98164: done getting next task for host managed_node1 15794 1726882613.98167: ^ task is: TASK: meta (flush_handlers) 15794 1726882613.98170: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882613.98175: getting variables 15794 1726882613.98181: in VariableManager get_vars() 15794 1726882613.98215: Calling all_inventory to load vars for managed_node1 15794 1726882613.98218: Calling groups_inventory to load vars for managed_node1 15794 1726882613.98222: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.98337: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.98343: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.98349: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000001d4 15794 1726882613.98352: WORKER PROCESS EXITING 15794 1726882613.98357: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.98737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.98985: done with get_vars() 15794 1726882613.98997: done getting variables 15794 1726882613.99072: in VariableManager get_vars() 15794 1726882613.99086: Calling all_inventory to load vars for managed_node1 15794 1726882613.99089: Calling groups_inventory to load vars for managed_node1 15794 1726882613.99092: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.99097: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.99100: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.99103: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.99325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882613.99603: done with get_vars() 15794 1726882613.99619: done queuing things up, now waiting for results queue to drain 15794 1726882613.99621: results queue empty 15794 1726882613.99622: checking for any_errors_fatal 15794 1726882613.99625: done checking for any_errors_fatal 15794 1726882613.99626: checking for max_fail_percentage 15794 
1726882613.99627: done checking for max_fail_percentage 15794 1726882613.99628: checking to see if all hosts have failed and the running result is not ok 15794 1726882613.99629: done checking to see if all hosts have failed 15794 1726882613.99638: getting the remaining hosts for this loop 15794 1726882613.99640: done getting the remaining hosts for this loop 15794 1726882613.99643: getting the next task for host managed_node1 15794 1726882613.99647: done getting next task for host managed_node1 15794 1726882613.99649: ^ task is: TASK: meta (flush_handlers) 15794 1726882613.99651: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882613.99654: getting variables 15794 1726882613.99655: in VariableManager get_vars() 15794 1726882613.99665: Calling all_inventory to load vars for managed_node1 15794 1726882613.99667: Calling groups_inventory to load vars for managed_node1 15794 1726882613.99670: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882613.99675: Calling all_plugins_play to load vars for managed_node1 15794 1726882613.99680: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882613.99684: Calling groups_plugins_play to load vars for managed_node1 15794 1726882613.99871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882614.00145: done with get_vars() 15794 1726882614.00154: done getting variables 15794 1726882614.00210: in VariableManager get_vars() 15794 1726882614.00219: Calling all_inventory to load vars for managed_node1 15794 1726882614.00221: Calling groups_inventory to load vars for managed_node1 15794 1726882614.00225: Calling all_plugins_inventory to load vars for managed_node1 15794 
1726882614.00229: Calling all_plugins_play to load vars for managed_node1 15794 1726882614.00232: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882614.00240: Calling groups_plugins_play to load vars for managed_node1 15794 1726882614.00431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882614.00737: done with get_vars() 15794 1726882614.00750: done queuing things up, now waiting for results queue to drain 15794 1726882614.00752: results queue empty 15794 1726882614.00753: checking for any_errors_fatal 15794 1726882614.00755: done checking for any_errors_fatal 15794 1726882614.00756: checking for max_fail_percentage 15794 1726882614.00757: done checking for max_fail_percentage 15794 1726882614.00758: checking to see if all hosts have failed and the running result is not ok 15794 1726882614.00759: done checking to see if all hosts have failed 15794 1726882614.00760: getting the remaining hosts for this loop 15794 1726882614.00761: done getting the remaining hosts for this loop 15794 1726882614.00764: getting the next task for host managed_node1 15794 1726882614.00767: done getting next task for host managed_node1 15794 1726882614.00768: ^ task is: None 15794 1726882614.00770: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882614.00771: done queuing things up, now waiting for results queue to drain 15794 1726882614.00772: results queue empty 15794 1726882614.00773: checking for any_errors_fatal 15794 1726882614.00774: done checking for any_errors_fatal 15794 1726882614.00775: checking for max_fail_percentage 15794 1726882614.00776: done checking for max_fail_percentage 15794 1726882614.00777: checking to see if all hosts have failed and the running result is not ok 15794 1726882614.00780: done checking to see if all hosts have failed 15794 1726882614.00781: getting the next task for host managed_node1 15794 1726882614.00784: done getting next task for host managed_node1 15794 1726882614.00785: ^ task is: None 15794 1726882614.00786: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882614.00842: in VariableManager get_vars() 15794 1726882614.00869: done with get_vars() 15794 1726882614.00876: in VariableManager get_vars() 15794 1726882614.00897: done with get_vars() 15794 1726882614.00903: variable 'omit' from source: magic vars 15794 1726882614.00941: in VariableManager get_vars() 15794 1726882614.00957: done with get_vars() 15794 1726882614.01045: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 15794 1726882614.02070: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882614.02098: getting the remaining hosts for this loop 15794 1726882614.02099: done getting the remaining hosts for this loop 15794 1726882614.02102: getting the next task for host managed_node1 15794 1726882614.02105: done getting next task for host managed_node1 15794 1726882614.02107: ^ task is: TASK: Gathering Facts 15794 1726882614.02109: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882614.02111: getting variables 15794 1726882614.02112: in VariableManager get_vars() 15794 1726882614.02125: Calling all_inventory to load vars for managed_node1 15794 1726882614.02127: Calling groups_inventory to load vars for managed_node1 15794 1726882614.02130: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882614.02138: Calling all_plugins_play to load vars for managed_node1 15794 1726882614.02142: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882614.02146: Calling groups_plugins_play to load vars for managed_node1 15794 1726882614.02348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882614.02653: done with get_vars() 15794 1726882614.02663: done getting variables 15794 1726882614.02715: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 21:36:54 -0400 (0:00:00.074) 0:00:11.585 ****** 15794 1726882614.02743: entering _queue_task() for managed_node1/gather_facts 15794 1726882614.03031: worker is 1 (out of 1 available) 15794 1726882614.03048: exiting _queue_task() for managed_node1/gather_facts 15794 1726882614.03063: done queuing things up, now waiting for results queue to drain 15794 1726882614.03065: waiting for pending results... 
15794 1726882614.03454: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882614.03462: in run() - task 0affe814-3a2d-94e5-e48f-000000000237 15794 1726882614.03486: variable 'ansible_search_path' from source: unknown 15794 1726882614.03529: calling self._execute() 15794 1726882614.03631: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882614.03649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882614.03670: variable 'omit' from source: magic vars 15794 1726882614.04170: variable 'ansible_distribution_major_version' from source: facts 15794 1726882614.04204: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882614.04216: variable 'omit' from source: magic vars 15794 1726882614.04251: variable 'omit' from source: magic vars 15794 1726882614.04297: variable 'omit' from source: magic vars 15794 1726882614.04350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882614.04393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882614.04426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882614.04454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882614.04470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882614.04511: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882614.04521: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882614.04537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882614.04661: Set connection var ansible_connection to ssh 15794 1726882614.04676: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882614.04689: Set connection var ansible_pipelining to False 15794 1726882614.04701: Set connection var ansible_shell_executable to /bin/sh 15794 1726882614.04751: Set connection var ansible_shell_type to sh 15794 1726882614.04754: Set connection var ansible_timeout to 10 15794 1726882614.04764: variable 'ansible_shell_executable' from source: unknown 15794 1726882614.04773: variable 'ansible_connection' from source: unknown 15794 1726882614.04780: variable 'ansible_module_compression' from source: unknown 15794 1726882614.04788: variable 'ansible_shell_type' from source: unknown 15794 1726882614.04795: variable 'ansible_shell_executable' from source: unknown 15794 1726882614.04802: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882614.04811: variable 'ansible_pipelining' from source: unknown 15794 1726882614.04818: variable 'ansible_timeout' from source: unknown 15794 1726882614.04827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882614.05076: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882614.05080: variable 'omit' from source: magic vars 15794 1726882614.05083: starting attempt loop 15794 1726882614.05085: running the handler 15794 1726882614.05098: variable 'ansible_facts' from source: unknown 15794 1726882614.05121: _low_level_execute_command(): starting 15794 1726882614.05137: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882614.05893: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882614.05910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15794 1726882614.05931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882614.05986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882614.06066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882614.06113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882614.06137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882614.06160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882614.06257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882614.08039: stdout chunk (state=3): >>>/root <<< 15794 1726882614.08108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882614.08200: stderr chunk (state=3): >>><<< 15794 1726882614.08204: stdout chunk (state=3): >>><<< 15794 1726882614.08229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882614.08541: _low_level_execute_command(): starting 15794 1726882614.08545: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457 `" && echo ansible-tmp-1726882614.0823977-16207-16602295795457="` echo /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457 `" ) && sleep 0' 15794 1726882614.08891: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882614.08906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882614.08923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882614.08950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882614.09053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882614.09083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882614.09102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882614.09188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882614.11205: stdout chunk (state=3): >>>ansible-tmp-1726882614.0823977-16207-16602295795457=/root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457 <<< 15794 1726882614.11320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882614.11405: stderr chunk (state=3): >>><<< 15794 1726882614.11420: stdout chunk (state=3): >>><<< 15794 1726882614.11448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882614.0823977-16207-16602295795457=/root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882614.11491: variable 'ansible_module_compression' from source: unknown 15794 1726882614.11560: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882614.11627: variable 'ansible_facts' from source: unknown 15794 1726882614.11826: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/AnsiballZ_setup.py 15794 1726882614.12027: Sending initial data 15794 1726882614.12030: Sent initial data (153 bytes) 15794 1726882614.12680: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882614.12760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882614.12781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882614.12804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882614.12892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882614.14533: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882614.14609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882614.14693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp0qt3h9bm /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/AnsiballZ_setup.py <<< 15794 1726882614.14697: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/AnsiballZ_setup.py" <<< 15794 1726882614.14773: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp0qt3h9bm" to remote "/root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/AnsiballZ_setup.py" <<< 15794 1726882614.17210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882614.17319: stderr chunk (state=3): >>><<< 15794 1726882614.17332: stdout chunk (state=3): >>><<< 15794 1726882614.17367: done transferring module to remote 15794 1726882614.17384: _low_level_execute_command(): starting 15794 1726882614.17397: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/ /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/AnsiballZ_setup.py && sleep 0' 15794 1726882614.18131: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882614.18153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882614.18170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882614.18187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882614.18282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882614.20222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882614.20242: stdout chunk (state=3): >>><<< 15794 1726882614.20255: stderr chunk (state=3): >>><<< 15794 1726882614.20278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882614.20294: _low_level_execute_command(): starting 15794 1726882614.20305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/AnsiballZ_setup.py && sleep 0' 15794 1726882614.20921: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882614.20941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882614.21057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882614.21072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882614.21090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882614.21354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882615.91555: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.48291015625, "5m": 0.4345703125, "15m": 0.212890625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_iscsi_iqn": "", "ansible_hostnqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2805, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 912, "free": 2805}, "nocache": {"free": 3410, "used": 307}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": 
[]}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 568, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205353472, "block_size": 4096, "block_total": 64483404, "block_available": 61329432, "block_used": 3153972, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "peerlsr27", "eth0", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, 
"ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b471:fa1a:61d2:e391", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391", "fe80::bb10:9a17:6b35:7604"]}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "55", "epoch": "1726882615", "epoch_int": "1726882615", "date": "2024-09-20", "time": "21:36:55", "iso8601_micro": "2024-09-21T01:36:55.911294Z", "iso8601": "2024-09-21T01:36:55Z", "iso8601_basic": "20240920T213655911294", "iso8601_basic_short": "20240920T213655", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882615.94343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882615.94347: stdout chunk (state=3): >>><<< 15794 1726882615.94350: stderr chunk (state=3): >>><<< 15794 1726882615.94353: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.48291015625, "5m": 0.4345703125, "15m": 0.212890625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2805, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 912, "free": 2805}, "nocache": {"free": 3410, "used": 307}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": 
{"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 568, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205353472, "block_size": 4096, "block_total": 64483404, "block_available": 61329432, "block_used": 3153972, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "peerlsr27", "eth0", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": 
"22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b471:fa1a:61d2:e391", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391", "fe80::bb10:9a17:6b35:7604"]}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "55", "epoch": "1726882615", "epoch_int": "1726882615", "date": "2024-09-20", "time": "21:36:55", "iso8601_micro": "2024-09-21T01:36:55.911294Z", "iso8601": "2024-09-21T01:36:55Z", "iso8601_basic": "20240920T213655911294", "iso8601_basic_short": "20240920T213655", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882615.95563: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882615.95641: _low_level_execute_command(): starting 15794 1726882615.95645: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882614.0823977-16207-16602295795457/ > /dev/null 2>&1 && sleep 0' 15794 1726882615.96866: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882615.96886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882615.97181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882615.97197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882615.97247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882615.99356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882615.99364: stderr chunk (state=3): >>><<< 15794 1726882615.99367: stdout chunk (state=3): >>><<< 15794 1726882615.99370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882615.99372: handler run complete 15794 1726882615.99907: variable 'ansible_facts' from source: unknown 15794 1726882616.00089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.00928: variable 'ansible_facts' from source: unknown 15794 1726882616.01233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.01618: attempt loop complete, returning result 15794 1726882616.01840: _execute() done 15794 1726882616.01844: dumping result to json 15794 1726882616.01852: done dumping result, returning 15794 1726882616.01855: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-000000000237] 15794 1726882616.01857: sending task result for task 0affe814-3a2d-94e5-e48f-000000000237 ok: [managed_node1] 15794 1726882616.03543: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000237 15794 1726882616.03547: WORKER PROCESS EXITING 15794 1726882616.03550: no more pending results, returning what we have 15794 1726882616.03554: results queue empty 15794 1726882616.03555: checking for any_errors_fatal 15794 1726882616.03557: done checking for any_errors_fatal 15794 1726882616.03558: checking for max_fail_percentage 15794 1726882616.03560: done checking for max_fail_percentage 15794 1726882616.03561: checking to see if all hosts have failed and the running result is not ok 15794 1726882616.03562: done checking to see if all hosts have failed 15794 1726882616.03563: getting the remaining hosts for this loop 15794 1726882616.03565: done getting the 
remaining hosts for this loop 15794 1726882616.03569: getting the next task for host managed_node1 15794 1726882616.03575: done getting next task for host managed_node1 15794 1726882616.03577: ^ task is: TASK: meta (flush_handlers) 15794 1726882616.03580: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882616.03584: getting variables 15794 1726882616.03585: in VariableManager get_vars() 15794 1726882616.03618: Calling all_inventory to load vars for managed_node1 15794 1726882616.03622: Calling groups_inventory to load vars for managed_node1 15794 1726882616.03625: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.03841: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.03846: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.03852: Calling groups_plugins_play to load vars for managed_node1 15794 1726882616.04265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.04969: done with get_vars() 15794 1726882616.04986: done getting variables 15794 1726882616.05071: in VariableManager get_vars() 15794 1726882616.05088: Calling all_inventory to load vars for managed_node1 15794 1726882616.05091: Calling groups_inventory to load vars for managed_node1 15794 1726882616.05094: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.05100: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.05103: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.05107: Calling groups_plugins_play to load vars for managed_node1 15794 1726882616.05652: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.05972: done with get_vars() 15794 1726882616.05991: done queuing things up, now waiting for results queue to drain 15794 1726882616.05993: results queue empty 15794 1726882616.05994: checking for any_errors_fatal 15794 1726882616.06000: done checking for any_errors_fatal 15794 1726882616.06001: checking for max_fail_percentage 15794 1726882616.06003: done checking for max_fail_percentage 15794 1726882616.06004: checking to see if all hosts have failed and the running result is not ok 15794 1726882616.06005: done checking to see if all hosts have failed 15794 1726882616.06016: getting the remaining hosts for this loop 15794 1726882616.06018: done getting the remaining hosts for this loop 15794 1726882616.06022: getting the next task for host managed_node1 15794 1726882616.06027: done getting next task for host managed_node1 15794 1726882616.06031: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15794 1726882616.06033: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882616.06048: getting variables 15794 1726882616.06049: in VariableManager get_vars() 15794 1726882616.06077: Calling all_inventory to load vars for managed_node1 15794 1726882616.06080: Calling groups_inventory to load vars for managed_node1 15794 1726882616.06083: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.06090: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.06094: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.06097: Calling groups_plugins_play to load vars for managed_node1 15794 1726882616.06304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.06580: done with get_vars() 15794 1726882616.06590: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:56 -0400 (0:00:02.039) 0:00:13.624 ****** 15794 1726882616.06675: entering _queue_task() for managed_node1/include_tasks 15794 1726882616.07372: worker is 1 (out of 1 available) 15794 1726882616.07383: exiting _queue_task() for managed_node1/include_tasks 15794 1726882616.07394: done queuing things up, now waiting for results queue to drain 15794 1726882616.07395: waiting for pending results... 
15794 1726882616.07469: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15794 1726882616.07588: in run() - task 0affe814-3a2d-94e5-e48f-000000000019 15794 1726882616.07611: variable 'ansible_search_path' from source: unknown 15794 1726882616.07624: variable 'ansible_search_path' from source: unknown 15794 1726882616.07671: calling self._execute() 15794 1726882616.07909: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882616.07923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882616.07943: variable 'omit' from source: magic vars 15794 1726882616.08652: variable 'ansible_distribution_major_version' from source: facts 15794 1726882616.08656: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882616.08658: _execute() done 15794 1726882616.08661: dumping result to json 15794 1726882616.08663: done dumping result, returning 15794 1726882616.08666: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-94e5-e48f-000000000019] 15794 1726882616.08669: sending task result for task 0affe814-3a2d-94e5-e48f-000000000019 15794 1726882616.08948: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000019 15794 1726882616.08952: WORKER PROCESS EXITING 15794 1726882616.08998: no more pending results, returning what we have 15794 1726882616.09004: in VariableManager get_vars() 15794 1726882616.09057: Calling all_inventory to load vars for managed_node1 15794 1726882616.09061: Calling groups_inventory to load vars for managed_node1 15794 1726882616.09064: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.09079: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.09083: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.09087: Calling 
groups_plugins_play to load vars for managed_node1 15794 1726882616.10201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.10500: done with get_vars() 15794 1726882616.10509: variable 'ansible_search_path' from source: unknown 15794 1726882616.10510: variable 'ansible_search_path' from source: unknown 15794 1726882616.10543: we have included files to process 15794 1726882616.10545: generating all_blocks data 15794 1726882616.10546: done generating all_blocks data 15794 1726882616.10547: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882616.10548: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882616.10551: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882616.11395: done processing included file 15794 1726882616.11397: iterating over new_blocks loaded from include file 15794 1726882616.11399: in VariableManager get_vars() 15794 1726882616.11423: done with get_vars() 15794 1726882616.11425: filtering new block on tags 15794 1726882616.11446: done filtering new block on tags 15794 1726882616.11449: in VariableManager get_vars() 15794 1726882616.11472: done with get_vars() 15794 1726882616.11474: filtering new block on tags 15794 1726882616.11496: done filtering new block on tags 15794 1726882616.11499: in VariableManager get_vars() 15794 1726882616.11521: done with get_vars() 15794 1726882616.11523: filtering new block on tags 15794 1726882616.11544: done filtering new block on tags 15794 1726882616.11547: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15794 1726882616.11552: extending task lists for 
all hosts with included blocks 15794 1726882616.12043: done extending task lists 15794 1726882616.12045: done processing included files 15794 1726882616.12046: results queue empty 15794 1726882616.12046: checking for any_errors_fatal 15794 1726882616.12048: done checking for any_errors_fatal 15794 1726882616.12049: checking for max_fail_percentage 15794 1726882616.12051: done checking for max_fail_percentage 15794 1726882616.12051: checking to see if all hosts have failed and the running result is not ok 15794 1726882616.12053: done checking to see if all hosts have failed 15794 1726882616.12054: getting the remaining hosts for this loop 15794 1726882616.12055: done getting the remaining hosts for this loop 15794 1726882616.12058: getting the next task for host managed_node1 15794 1726882616.12063: done getting next task for host managed_node1 15794 1726882616.12065: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15794 1726882616.12068: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882616.12079: getting variables 15794 1726882616.12081: in VariableManager get_vars() 15794 1726882616.12098: Calling all_inventory to load vars for managed_node1 15794 1726882616.12101: Calling groups_inventory to load vars for managed_node1 15794 1726882616.12104: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.12110: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.12113: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.12116: Calling groups_plugins_play to load vars for managed_node1 15794 1726882616.12345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.12745: done with get_vars() 15794 1726882616.12756: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:56 -0400 (0:00:00.061) 0:00:13.686 ****** 15794 1726882616.12838: entering _queue_task() for managed_node1/setup 15794 1726882616.13343: worker is 1 (out of 1 available) 15794 1726882616.13353: exiting _queue_task() for managed_node1/setup 15794 1726882616.13364: done queuing things up, now waiting for results queue to drain 15794 1726882616.13366: waiting for pending results... 
15794 1726882616.13509: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15794 1726882616.13625: in run() - task 0affe814-3a2d-94e5-e48f-000000000279 15794 1726882616.13650: variable 'ansible_search_path' from source: unknown 15794 1726882616.13659: variable 'ansible_search_path' from source: unknown 15794 1726882616.13706: calling self._execute() 15794 1726882616.13816: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882616.13832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882616.13852: variable 'omit' from source: magic vars 15794 1726882616.14391: variable 'ansible_distribution_major_version' from source: facts 15794 1726882616.14470: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882616.15008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882616.18331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882616.18449: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882616.18498: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882616.18595: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882616.18723: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882616.18987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882616.18991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882616.19040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882616.19113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882616.19137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882616.19238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882616.19299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882616.19344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882616.19398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882616.19423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882616.19637: variable '__network_required_facts' from source: role 
'' defaults 15794 1726882616.19656: variable 'ansible_facts' from source: unknown 15794 1726882616.19778: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15794 1726882616.19788: when evaluation is False, skipping this task 15794 1726882616.19796: _execute() done 15794 1726882616.19803: dumping result to json 15794 1726882616.19860: done dumping result, returning 15794 1726882616.19864: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-94e5-e48f-000000000279] 15794 1726882616.19867: sending task result for task 0affe814-3a2d-94e5-e48f-000000000279 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882616.20095: no more pending results, returning what we have 15794 1726882616.20100: results queue empty 15794 1726882616.20102: checking for any_errors_fatal 15794 1726882616.20104: done checking for any_errors_fatal 15794 1726882616.20105: checking for max_fail_percentage 15794 1726882616.20107: done checking for max_fail_percentage 15794 1726882616.20108: checking to see if all hosts have failed and the running result is not ok 15794 1726882616.20109: done checking to see if all hosts have failed 15794 1726882616.20110: getting the remaining hosts for this loop 15794 1726882616.20112: done getting the remaining hosts for this loop 15794 1726882616.20117: getting the next task for host managed_node1 15794 1726882616.20129: done getting next task for host managed_node1 15794 1726882616.20137: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15794 1726882616.20142: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882616.20159: getting variables 15794 1726882616.20161: in VariableManager get_vars() 15794 1726882616.20207: Calling all_inventory to load vars for managed_node1 15794 1726882616.20211: Calling groups_inventory to load vars for managed_node1 15794 1726882616.20214: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.20226: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.20230: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.20237: Calling groups_plugins_play to load vars for managed_node1 15794 1726882616.20670: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000279 15794 1726882616.20673: WORKER PROCESS EXITING 15794 1726882616.20700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.21051: done with get_vars() 15794 1726882616.21064: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:56 -0400 (0:00:00.083) 0:00:13.769 ****** 15794 1726882616.21172: entering _queue_task() for managed_node1/stat 15794 1726882616.21515: worker is 1 (out of 1 available) 15794 1726882616.21529: exiting _queue_task() for managed_node1/stat 15794 1726882616.21546: done queuing things up, now waiting for results queue to drain 15794 1726882616.21548: waiting for pending results... 
15794 1726882616.21815: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15794 1726882616.21968: in run() - task 0affe814-3a2d-94e5-e48f-00000000027b 15794 1726882616.21992: variable 'ansible_search_path' from source: unknown 15794 1726882616.22000: variable 'ansible_search_path' from source: unknown 15794 1726882616.22044: calling self._execute() 15794 1726882616.22150: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882616.22164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882616.22188: variable 'omit' from source: magic vars 15794 1726882616.22630: variable 'ansible_distribution_major_version' from source: facts 15794 1726882616.22650: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882616.22865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882616.23198: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882616.23255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882616.23309: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882616.23356: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882616.23472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882616.23514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882616.23564: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882616.23630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882616.24142: variable '__network_is_ostree' from source: set_fact 15794 1726882616.24146: Evaluated conditional (not __network_is_ostree is defined): False 15794 1726882616.24149: when evaluation is False, skipping this task 15794 1726882616.24151: _execute() done 15794 1726882616.24154: dumping result to json 15794 1726882616.24156: done dumping result, returning 15794 1726882616.24159: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-94e5-e48f-00000000027b] 15794 1726882616.24161: sending task result for task 0affe814-3a2d-94e5-e48f-00000000027b 15794 1726882616.24339: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000027b 15794 1726882616.24343: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15794 1726882616.24402: no more pending results, returning what we have 15794 1726882616.24407: results queue empty 15794 1726882616.24408: checking for any_errors_fatal 15794 1726882616.24416: done checking for any_errors_fatal 15794 1726882616.24417: checking for max_fail_percentage 15794 1726882616.24419: done checking for max_fail_percentage 15794 1726882616.24420: checking to see if all hosts have failed and the running result is not ok 15794 1726882616.24421: done checking to see if all hosts have failed 15794 1726882616.24422: getting the remaining hosts for this loop 15794 1726882616.24424: done getting the remaining hosts for this loop 15794 
1726882616.24429: getting the next task for host managed_node1 15794 1726882616.24443: done getting next task for host managed_node1 15794 1726882616.24447: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15794 1726882616.24451: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882616.24466: getting variables 15794 1726882616.24469: in VariableManager get_vars() 15794 1726882616.24512: Calling all_inventory to load vars for managed_node1 15794 1726882616.24516: Calling groups_inventory to load vars for managed_node1 15794 1726882616.24519: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.24529: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.24532: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.24744: Calling groups_plugins_play to load vars for managed_node1 15794 1726882616.25164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.25576: done with get_vars() 15794 1726882616.25600: done getting variables 15794 1726882616.25673: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:56 -0400 (0:00:00.045) 0:00:13.815 ****** 15794 1726882616.25721: entering _queue_task() for managed_node1/set_fact 15794 1726882616.26262: worker is 1 (out of 1 available) 15794 1726882616.26273: exiting _queue_task() for managed_node1/set_fact 15794 1726882616.26288: done queuing things up, now waiting for results queue to drain 15794 1726882616.26289: waiting for pending results... 15794 1726882616.26390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15794 1726882616.26555: in run() - task 0affe814-3a2d-94e5-e48f-00000000027c 15794 1726882616.26580: variable 'ansible_search_path' from source: unknown 15794 1726882616.26627: variable 'ansible_search_path' from source: unknown 15794 1726882616.26646: calling self._execute() 15794 1726882616.26763: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882616.26778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882616.26844: variable 'omit' from source: magic vars 15794 1726882616.27260: variable 'ansible_distribution_major_version' from source: facts 15794 1726882616.27290: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882616.27502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882616.27917: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882616.27986: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882616.28046: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 
1726882616.28090: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882616.28261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882616.28268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882616.28289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882616.28328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882616.28453: variable '__network_is_ostree' from source: set_fact 15794 1726882616.28467: Evaluated conditional (not __network_is_ostree is defined): False 15794 1726882616.28481: when evaluation is False, skipping this task 15794 1726882616.28495: _execute() done 15794 1726882616.28504: dumping result to json 15794 1726882616.28539: done dumping result, returning 15794 1726882616.28542: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-94e5-e48f-00000000027c] 15794 1726882616.28554: sending task result for task 0affe814-3a2d-94e5-e48f-00000000027c skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15794 1726882616.28838: no more pending results, returning what we have 15794 1726882616.28843: results queue empty 15794 1726882616.28844: checking for any_errors_fatal 15794 1726882616.28851: done checking 
for any_errors_fatal 15794 1726882616.28852: checking for max_fail_percentage 15794 1726882616.28855: done checking for max_fail_percentage 15794 1726882616.28856: checking to see if all hosts have failed and the running result is not ok 15794 1726882616.28857: done checking to see if all hosts have failed 15794 1726882616.28858: getting the remaining hosts for this loop 15794 1726882616.28860: done getting the remaining hosts for this loop 15794 1726882616.28865: getting the next task for host managed_node1 15794 1726882616.28876: done getting next task for host managed_node1 15794 1726882616.28881: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15794 1726882616.28885: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882616.28903: getting variables 15794 1726882616.28905: in VariableManager get_vars() 15794 1726882616.29063: Calling all_inventory to load vars for managed_node1 15794 1726882616.29067: Calling groups_inventory to load vars for managed_node1 15794 1726882616.29071: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882616.29078: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000027c 15794 1726882616.29082: WORKER PROCESS EXITING 15794 1726882616.29093: Calling all_plugins_play to load vars for managed_node1 15794 1726882616.29098: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882616.29102: Calling groups_plugins_play to load vars for managed_node1 15794 1726882616.29563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882616.29895: done with get_vars() 15794 1726882616.29908: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:56 -0400 (0:00:00.043) 0:00:13.858 ****** 15794 1726882616.30050: entering _queue_task() for managed_node1/service_facts 15794 1726882616.30052: Creating lock for service_facts 15794 1726882616.30492: worker is 1 (out of 1 available) 15794 1726882616.30504: exiting _queue_task() for managed_node1/service_facts 15794 1726882616.30517: done queuing things up, now waiting for results queue to drain 15794 1726882616.30519: waiting for pending results... 
15794 1726882616.30696: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15794 1726882616.30894: in run() - task 0affe814-3a2d-94e5-e48f-00000000027e 15794 1726882616.30921: variable 'ansible_search_path' from source: unknown 15794 1726882616.30931: variable 'ansible_search_path' from source: unknown 15794 1726882616.30980: calling self._execute() 15794 1726882616.31084: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882616.31098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882616.31114: variable 'omit' from source: magic vars 15794 1726882616.31568: variable 'ansible_distribution_major_version' from source: facts 15794 1726882616.31587: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882616.31598: variable 'omit' from source: magic vars 15794 1726882616.31681: variable 'omit' from source: magic vars 15794 1726882616.31735: variable 'omit' from source: magic vars 15794 1726882616.31788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882616.31837: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882616.31886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882616.31895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882616.31911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882616.31995: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882616.31999: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882616.32002: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15794 1726882616.32107: Set connection var ansible_connection to ssh 15794 1726882616.32122: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882616.32136: Set connection var ansible_pipelining to False 15794 1726882616.32153: Set connection var ansible_shell_executable to /bin/sh 15794 1726882616.32161: Set connection var ansible_shell_type to sh 15794 1726882616.32212: Set connection var ansible_timeout to 10 15794 1726882616.32218: variable 'ansible_shell_executable' from source: unknown 15794 1726882616.32227: variable 'ansible_connection' from source: unknown 15794 1726882616.32236: variable 'ansible_module_compression' from source: unknown 15794 1726882616.32244: variable 'ansible_shell_type' from source: unknown 15794 1726882616.32252: variable 'ansible_shell_executable' from source: unknown 15794 1726882616.32322: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882616.32325: variable 'ansible_pipelining' from source: unknown 15794 1726882616.32328: variable 'ansible_timeout' from source: unknown 15794 1726882616.32330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882616.32526: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882616.32551: variable 'omit' from source: magic vars 15794 1726882616.32563: starting attempt loop 15794 1726882616.32570: running the handler 15794 1726882616.32593: _low_level_execute_command(): starting 15794 1726882616.32606: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882616.33387: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882616.33450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15794 1726882616.33471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882616.33553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882616.33577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882616.33609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882616.33703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882616.35485: stdout chunk (state=3): >>>/root <<< 15794 1726882616.35667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882616.35685: stderr chunk (state=3): >>><<< 15794 1726882616.35700: stdout chunk (state=3): >>><<< 15794 1726882616.35725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882616.35748: _low_level_execute_command(): starting 15794 1726882616.35767: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376 `" && echo ansible-tmp-1726882616.3573215-16280-178055145536376="` echo /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376 `" ) && sleep 0' 15794 1726882616.36510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882616.36556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882616.36577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882616.36616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882616.36695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882616.38765: stdout chunk (state=3): >>>ansible-tmp-1726882616.3573215-16280-178055145536376=/root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376 <<< 15794 1726882616.38927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882616.38930: stdout chunk (state=3): >>><<< 15794 1726882616.38935: stderr chunk (state=3): >>><<< 15794 1726882616.38955: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882616.3573215-16280-178055145536376=/root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882616.39141: variable 'ansible_module_compression' from source: unknown 15794 1726882616.39145: ANSIBALLZ: Using lock for service_facts 15794 1726882616.39147: ANSIBALLZ: Acquiring lock 15794 1726882616.39150: ANSIBALLZ: Lock acquired: 139758818643552 15794 1726882616.39152: ANSIBALLZ: Creating module 15794 1726882616.55583: ANSIBALLZ: Writing module into payload 15794 1726882616.55708: ANSIBALLZ: Writing module 15794 1726882616.55730: ANSIBALLZ: Renaming module 15794 1726882616.55739: ANSIBALLZ: Done creating module 15794 1726882616.55757: variable 'ansible_facts' from source: unknown 15794 1726882616.55946: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/AnsiballZ_service_facts.py 15794 1726882616.56343: Sending initial data 15794 1726882616.56347: Sent initial data (162 bytes) 15794 1726882616.56681: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882616.56690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882616.56751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882616.56808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882616.56822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882616.56869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882616.57063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882616.58816: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15794 1726882616.58830: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15794 1726882616.58849: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 15794 1726882616.58881: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882616.58940: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882616.59004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpyn19xmjr /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/AnsiballZ_service_facts.py <<< 15794 1726882616.59007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/AnsiballZ_service_facts.py" <<< 15794 1726882616.59070: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpyn19xmjr" to remote "/root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/AnsiballZ_service_facts.py" <<< 15794 1726882616.60417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882616.60421: stdout chunk (state=3): >>><<< 15794 1726882616.60423: stderr chunk (state=3): >>><<< 15794 1726882616.60426: done transferring module to remote 15794 1726882616.60428: _low_level_execute_command(): starting 15794 1726882616.60430: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/ /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/AnsiballZ_service_facts.py && sleep 0' 15794 1726882616.60977: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882616.60995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882616.61010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882616.61098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882616.61136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882616.61167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882616.61205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882616.61444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882616.63623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882616.63627: stdout chunk (state=3): >>><<< 15794 1726882616.63630: stderr chunk (state=3): >>><<< 15794 1726882616.63633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882616.63640: _low_level_execute_command(): starting 15794 1726882616.63643: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/AnsiballZ_service_facts.py && sleep 0' 15794 1726882616.64898: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882616.64972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882618.59819: stdout chunk (state=3): >>> {"ansible_facts": 
{"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": 
"dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": 
"lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", 
"source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": 
"systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "s<<< 15794 1726882618.59837: stdout chunk (state=3): >>>tate": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive"<<< 15794 1726882618.60045: stdout chunk (state=3): >>>, "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, <<< 15794 1726882618.60049: stdout chunk (state=3): >>>"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": 
"systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15794 1726882618.61738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882618.61742: stderr chunk (state=3): >>><<< 15794 1726882618.61747: stdout chunk (state=3): >>><<< 15794 1726882618.61864: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882618.64017: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882618.64214: _low_level_execute_command(): starting 15794 1726882618.64219: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882616.3573215-16280-178055145536376/ > /dev/null 2>&1 && sleep 0' 15794 1726882618.65352: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882618.65415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882618.65430: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882618.65444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882618.65503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882618.65521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882618.65955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882618.65961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882618.67976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882618.68003: stderr chunk (state=3): >>><<< 15794 1726882618.68006: stdout chunk (state=3): >>><<< 15794 1726882618.68036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882618.68041: handler run complete 15794 1726882618.69198: variable 'ansible_facts' from source: unknown 15794 1726882618.69854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882618.72394: variable 'ansible_facts' from source: unknown 15794 1726882618.73041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882618.73839: attempt loop complete, returning result 15794 1726882618.73956: _execute() done 15794 1726882618.73965: dumping result to json 15794 1726882618.74185: done dumping result, returning 15794 1726882618.74203: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-94e5-e48f-00000000027e] 15794 1726882618.74215: sending task result for task 0affe814-3a2d-94e5-e48f-00000000027e ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882618.79551: no more pending results, returning what we have 15794 1726882618.79554: results queue empty 15794 1726882618.79556: checking for any_errors_fatal 15794 1726882618.79561: done checking for any_errors_fatal 15794 
1726882618.79562: checking for max_fail_percentage 15794 1726882618.79564: done checking for max_fail_percentage 15794 1726882618.79565: checking to see if all hosts have failed and the running result is not ok 15794 1726882618.79566: done checking to see if all hosts have failed 15794 1726882618.79567: getting the remaining hosts for this loop 15794 1726882618.79569: done getting the remaining hosts for this loop 15794 1726882618.79573: getting the next task for host managed_node1 15794 1726882618.79582: done getting next task for host managed_node1 15794 1726882618.79586: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15794 1726882618.79591: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882618.79602: getting variables 15794 1726882618.79604: in VariableManager get_vars() 15794 1726882618.79645: Calling all_inventory to load vars for managed_node1 15794 1726882618.79649: Calling groups_inventory to load vars for managed_node1 15794 1726882618.79652: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882618.79665: Calling all_plugins_play to load vars for managed_node1 15794 1726882618.79668: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882618.79672: Calling groups_plugins_play to load vars for managed_node1 15794 1726882618.80384: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000027e 15794 1726882618.80388: WORKER PROCESS EXITING 15794 1726882618.81275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882618.82922: done with get_vars() 15794 1726882618.82945: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:58 -0400 (0:00:02.531) 0:00:16.390 ****** 15794 1726882618.83239: entering _queue_task() for managed_node1/package_facts 15794 1726882618.83241: Creating lock for package_facts 15794 1726882618.83918: worker is 1 (out of 1 available) 15794 1726882618.83931: exiting _queue_task() for managed_node1/package_facts 15794 1726882618.84204: done queuing things up, now waiting for results queue to drain 15794 1726882618.84206: waiting for pending results... 
15794 1726882618.84551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15794 1726882618.84817: in run() - task 0affe814-3a2d-94e5-e48f-00000000027f 15794 1726882618.85045: variable 'ansible_search_path' from source: unknown 15794 1726882618.85049: variable 'ansible_search_path' from source: unknown 15794 1726882618.85053: calling self._execute() 15794 1726882618.85297: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882618.85311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882618.85327: variable 'omit' from source: magic vars 15794 1726882618.86256: variable 'ansible_distribution_major_version' from source: facts 15794 1726882618.86260: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882618.86288: variable 'omit' from source: magic vars 15794 1726882618.86398: variable 'omit' from source: magic vars 15794 1726882618.86449: variable 'omit' from source: magic vars 15794 1726882618.86506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882618.86552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882618.86589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882618.86615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882618.86632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882618.86686: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882618.86697: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882618.86705: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15794 1726882618.86860: Set connection var ansible_connection to ssh 15794 1726882618.86864: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882618.86868: Set connection var ansible_pipelining to False 15794 1726882618.86884: Set connection var ansible_shell_executable to /bin/sh 15794 1726882618.86898: Set connection var ansible_shell_type to sh 15794 1726882618.86969: Set connection var ansible_timeout to 10 15794 1726882618.86972: variable 'ansible_shell_executable' from source: unknown 15794 1726882618.86975: variable 'ansible_connection' from source: unknown 15794 1726882618.86977: variable 'ansible_module_compression' from source: unknown 15794 1726882618.86984: variable 'ansible_shell_type' from source: unknown 15794 1726882618.86992: variable 'ansible_shell_executable' from source: unknown 15794 1726882618.87003: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882618.87014: variable 'ansible_pipelining' from source: unknown 15794 1726882618.87022: variable 'ansible_timeout' from source: unknown 15794 1726882618.87031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882618.87295: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882618.87332: variable 'omit' from source: magic vars 15794 1726882618.87338: starting attempt loop 15794 1726882618.87340: running the handler 15794 1726882618.87354: _low_level_execute_command(): starting 15794 1726882618.87405: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882618.88216: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882618.88304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882618.88337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882618.88356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882618.88475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882618.90293: stdout chunk (state=3): >>>/root <<< 15794 1726882618.90495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882618.90499: stdout chunk (state=3): >>><<< 15794 1726882618.90501: stderr chunk (state=3): >>><<< 15794 1726882618.90766: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882618.90770: _low_level_execute_command(): starting 15794 1726882618.90774: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842 `" && echo ansible-tmp-1726882618.9065974-16389-154436734704842="` echo /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842 `" ) && sleep 0' 15794 1726882618.91836: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882618.91917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882618.92092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882618.92150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882618.94246: stdout chunk (state=3): >>>ansible-tmp-1726882618.9065974-16389-154436734704842=/root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842 <<< 15794 1726882618.94451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882618.94740: stderr chunk (state=3): >>><<< 15794 1726882618.94744: stdout chunk (state=3): >>><<< 15794 1726882618.94749: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882618.9065974-16389-154436734704842=/root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882618.94752: variable 'ansible_module_compression' from source: unknown 15794 1726882618.94803: ANSIBALLZ: Using lock for package_facts 15794 1726882618.94812: ANSIBALLZ: Acquiring lock 15794 1726882618.95042: ANSIBALLZ: Lock acquired: 139758818654256 15794 1726882618.95045: ANSIBALLZ: Creating module 15794 1726882619.79249: ANSIBALLZ: Writing module into payload 15794 1726882619.79680: ANSIBALLZ: Writing module 15794 1726882619.79725: ANSIBALLZ: Renaming module 15794 1726882619.80163: ANSIBALLZ: Done creating module 15794 1726882619.80167: variable 'ansible_facts' from source: unknown 15794 1726882619.80439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/AnsiballZ_package_facts.py 15794 1726882619.80966: Sending initial data 15794 1726882619.80977: Sent initial data (162 bytes) 15794 1726882619.82705: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882619.82980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882619.83253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882619.83356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882619.85225: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882619.85279: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882619.85355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpxvk6kyo7 /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/AnsiballZ_package_facts.py <<< 15794 1726882619.85360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/AnsiballZ_package_facts.py" <<< 15794 1726882619.85394: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpxvk6kyo7" to remote "/root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/AnsiballZ_package_facts.py" <<< 15794 1726882619.90216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882619.90360: stderr chunk (state=3): >>><<< 15794 1726882619.90364: stdout chunk (state=3): >>><<< 15794 1726882619.90368: done transferring module to remote 15794 1726882619.90492: _low_level_execute_command(): starting 15794 1726882619.90496: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/ /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/AnsiballZ_package_facts.py && sleep 0' 15794 1726882619.91643: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882619.91646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882619.91649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882619.91693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882619.91825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 
originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882619.91957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882619.92050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882619.94094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882619.94103: stdout chunk (state=3): >>><<< 15794 1726882619.94109: stderr chunk (state=3): >>><<< 15794 1726882619.94126: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882619.94129: _low_level_execute_command(): starting 15794 1726882619.94138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/AnsiballZ_package_facts.py && sleep 0' 15794 1726882619.95861: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882619.95865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882619.95868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882619.95870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882619.95873: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882619.96109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882619.96117: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 15794 1726882619.96179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882619.96332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882620.60098: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": 
"cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 15794 1726882620.60220: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", 
"version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": 
"3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", 
"version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": 
"2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": 
"libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 15794 1726882620.60284: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": 
"121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": 
"gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", 
"version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": 
"libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": 
[{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": 
[{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": 
"python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "a<<< 15794 1726882620.60367: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15794 1726882620.62273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
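The stdout chunks above carry the JSON result of Ansible's `package_facts` module: a `packages` mapping from package name to a list of install records (a name can map to several entries, e.g. multilib or multiple kernels), plus the `invocation.module_args` that produced it. A minimal sketch of querying such a payload — the sample data and the `package_versions` helper are illustrative, not part of the log:

```python
import json

# Trimmed sample mirroring the package_facts result printed above
# (the real output lists hundreds of packages).
raw = json.dumps({
    "ansible_facts": {
        "packages": {
            "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39",
                     "epoch": None, "arch": "x86_64", "source": "rpm"}],
            "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39",
                     "epoch": 2, "arch": "x86_64", "source": "rpm"}],
        }
    },
    "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}},
})

def package_versions(result_json, name):
    """Return every installed version of `name`; a package may have
    multiple entries, so this always returns a list."""
    packages = json.loads(result_json)["ansible_facts"]["packages"]
    return [entry["version"] for entry in packages.get(name, [])]

print(package_versions(raw, "git"))  # ['2.46.0']
print(package_versions(raw, "absent-package"))  # []
```

In a playbook the same lookup is usually done in Jinja2 against `ansible_facts.packages` after a `package_facts` task; the Python form here just makes the shape of the dumped JSON explicit.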
<<< 15794 1726882620.62333: stderr chunk (state=3): >>><<< 15794 1726882620.62338: stdout chunk (state=3): >>><<< 15794 1726882620.62372: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": 
"zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": 
"1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": 
"xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882620.69808: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882620.69815: _low_level_execute_command(): starting 15794 1726882620.69820: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882618.9065974-16389-154436734704842/ > /dev/null 2>&1 && sleep 0' 15794 1726882620.70341: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882620.70345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 
originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882620.70347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882620.70350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882620.70408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882620.70415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882620.70478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882620.72482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882620.72587: stderr chunk (state=3): >>><<< 15794 1726882620.72593: stdout chunk (state=3): >>><<< 15794 1726882620.72726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882620.72747: handler run complete 15794 1726882620.74184: variable 'ansible_facts' from source: unknown 15794 1726882620.74828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882620.78103: variable 'ansible_facts' from source: unknown 15794 1726882620.78814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882620.79945: attempt loop complete, returning result 15794 1726882620.79963: _execute() done 15794 1726882620.79967: dumping result to json 15794 1726882620.80157: done dumping result, returning 15794 1726882620.80167: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-94e5-e48f-00000000027f] 15794 1726882620.80170: sending task result for task 0affe814-3a2d-94e5-e48f-00000000027f 15794 1726882620.85129: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000027f 15794 1726882620.85133: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882620.85237: no more pending results, returning what we have 15794 1726882620.85240: results queue empty 15794 1726882620.85241: checking for any_errors_fatal 15794 1726882620.85248: done checking for any_errors_fatal 15794 
1726882620.85249: checking for max_fail_percentage 15794 1726882620.85251: done checking for max_fail_percentage 15794 1726882620.85252: checking to see if all hosts have failed and the running result is not ok 15794 1726882620.85253: done checking to see if all hosts have failed 15794 1726882620.85254: getting the remaining hosts for this loop 15794 1726882620.85255: done getting the remaining hosts for this loop 15794 1726882620.85259: getting the next task for host managed_node1 15794 1726882620.85266: done getting next task for host managed_node1 15794 1726882620.85270: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15794 1726882620.85272: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882620.85282: getting variables 15794 1726882620.85284: in VariableManager get_vars() 15794 1726882620.85317: Calling all_inventory to load vars for managed_node1 15794 1726882620.85320: Calling groups_inventory to load vars for managed_node1 15794 1726882620.85323: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882620.85333: Calling all_plugins_play to load vars for managed_node1 15794 1726882620.85338: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882620.85343: Calling groups_plugins_play to load vars for managed_node1 15794 1726882620.88463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882620.92645: done with get_vars() 15794 1726882620.92684: done getting variables 15794 1726882620.92810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:00 -0400 (0:00:02.096) 0:00:18.486 ****** 15794 1726882620.92963: entering _queue_task() for managed_node1/debug 15794 1726882620.94074: worker is 1 (out of 1 available) 15794 1726882620.94088: exiting _queue_task() for managed_node1/debug 15794 1726882620.94101: done queuing things up, now waiting for results queue to drain 15794 1726882620.94103: waiting for pending results... 15794 1726882620.95406: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15794 1726882620.95724: in run() - task 0affe814-3a2d-94e5-e48f-00000000001a 15794 1726882620.95728: variable 'ansible_search_path' from source: unknown 15794 1726882620.95731: variable 'ansible_search_path' from source: unknown 15794 1726882620.95858: calling self._execute() 15794 1726882620.96261: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882620.96267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882620.96271: variable 'omit' from source: magic vars 15794 1726882620.97240: variable 'ansible_distribution_major_version' from source: facts 15794 1726882620.97246: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882620.97249: variable 'omit' from source: magic vars 15794 1726882620.97447: variable 'omit' from source: magic vars 15794 1726882620.97692: variable 'network_provider' from source: set_fact 15794 1726882620.97719: variable 'omit' from source: magic vars 15794 1726882620.97896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 
1726882620.97931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15794 1726882620.97992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15794 1726882620.98060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882620.98084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882620.98198: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15794 1726882620.98211: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882620.98227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882620.98473: Set connection var ansible_connection to ssh
15794 1726882620.98525: Set connection var ansible_module_compression to ZIP_DEFLATED
15794 1726882620.98661: Set connection var ansible_pipelining to False
15794 1726882620.98664: Set connection var ansible_shell_executable to /bin/sh
15794 1726882620.98667: Set connection var ansible_shell_type to sh
15794 1726882620.98670: Set connection var ansible_timeout to 10
15794 1726882620.98695: variable 'ansible_shell_executable' from source: unknown
15794 1726882620.98740: variable 'ansible_connection' from source: unknown
15794 1726882620.98749: variable 'ansible_module_compression' from source: unknown
15794 1726882620.98758: variable 'ansible_shell_type' from source: unknown
15794 1726882620.98771: variable 'ansible_shell_executable' from source: unknown
15794 1726882620.98780: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882620.98879: variable 'ansible_pipelining' from source: unknown
15794 1726882620.98883: variable 'ansible_timeout' from source: unknown
15794 1726882620.98885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882620.99213: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15794 1726882620.99314: variable 'omit' from source: magic vars
15794 1726882620.99318: starting attempt loop
15794 1726882620.99321: running the handler
15794 1726882620.99410: handler run complete
15794 1726882620.99441: attempt loop complete, returning result
15794 1726882620.99530: _execute() done
15794 1726882620.99533: dumping result to json
15794 1726882620.99538: done dumping result, returning
15794 1726882620.99541: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-94e5-e48f-00000000001a]
15794 1726882620.99543: sending task result for task 0affe814-3a2d-94e5-e48f-00000000001a
15794 1726882620.99840: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000001a
15794 1726882620.99844: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

Using network provider: nm
15794 1726882620.99920: no more pending results, returning what we have
15794 1726882620.99924: results queue empty
15794 1726882620.99925: checking for any_errors_fatal
15794 1726882620.99940: done checking for any_errors_fatal
15794 1726882620.99942: checking for max_fail_percentage
15794 1726882620.99944: done checking for max_fail_percentage
15794 1726882620.99945: checking to see if all hosts have failed and the running result is not ok
15794 1726882620.99946: done checking to see if all hosts have failed
15794 1726882620.99947: getting the remaining hosts for this loop
15794 1726882620.99949: done getting the remaining hosts for this loop
15794 1726882620.99955: getting the next task for host managed_node1
15794 1726882620.99962: done getting next task for host managed_node1
15794 1726882620.99968: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
15794 1726882620.99971: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882620.99983: getting variables
15794 1726882620.99986: in VariableManager get_vars()
15794 1726882621.00029: Calling all_inventory to load vars for managed_node1
15794 1726882621.00033: Calling groups_inventory to load vars for managed_node1
15794 1726882621.00410: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882621.00421: Calling all_plugins_play to load vars for managed_node1
15794 1726882621.00425: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882621.00429: Calling groups_plugins_play to load vars for managed_node1
15794 1726882621.04868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882621.09929: done with get_vars()
15794 1726882621.09988: done getting variables
15794 1726882621.10069: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024  21:37:01 -0400 (0:00:00.172)       0:00:18.659 ******
15794 1726882621.10108: entering _queue_task() for managed_node1/fail
15794 1726882621.10510: worker is 1 (out of 1 available)
15794 1726882621.10525: exiting _queue_task() for managed_node1/fail
15794 1726882621.10650: done queuing things up, now waiting for results queue to drain
15794 1726882621.10658: waiting for pending results...
15794 1726882621.10917: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
15794 1726882621.11108: in run() - task 0affe814-3a2d-94e5-e48f-00000000001b
15794 1726882621.11116: variable 'ansible_search_path' from source: unknown
15794 1726882621.11120: variable 'ansible_search_path' from source: unknown
15794 1726882621.11127: calling self._execute()
15794 1726882621.11250: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882621.11262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882621.11277: variable 'omit' from source: magic vars
15794 1726882621.12340: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.12344: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882621.12346: variable 'network_state' from source: role '' defaults
15794 1726882621.12583: Evaluated conditional (network_state != {}): False
15794 1726882621.12587: when evaluation is False, skipping this task
15794 1726882621.12590: _execute() done
15794 1726882621.12593: dumping result to json
15794 1726882621.12595: done dumping result, returning
15794 1726882621.12598: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-94e5-e48f-00000000001b]
15794 1726882621.12601: sending task result for task 0affe814-3a2d-94e5-e48f-00000000001b
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15794 1726882621.12753: no more pending results, returning what we have
15794 1726882621.12758: results queue empty
15794 1726882621.12760: checking for any_errors_fatal
15794 1726882621.12769: done checking for any_errors_fatal
15794 1726882621.12771: checking for max_fail_percentage
15794 1726882621.12773: done checking for max_fail_percentage
15794 1726882621.12774: checking to see if all hosts have failed and the running result is not ok
15794 1726882621.12775: done checking to see if all hosts have failed
15794 1726882621.12776: getting the remaining hosts for this loop
15794 1726882621.12781: done getting the remaining hosts for this loop
15794 1726882621.12786: getting the next task for host managed_node1
15794 1726882621.12842: done getting next task for host managed_node1
15794 1726882621.12847: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
15794 1726882621.12851: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882621.12869: getting variables
15794 1726882621.12872: in VariableManager get_vars()
15794 1726882621.13146: Calling all_inventory to load vars for managed_node1
15794 1726882621.13150: Calling groups_inventory to load vars for managed_node1
15794 1726882621.13154: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882621.13162: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000001b
15794 1726882621.13165: WORKER PROCESS EXITING
15794 1726882621.13181: Calling all_plugins_play to load vars for managed_node1
15794 1726882621.13186: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882621.13191: Calling groups_plugins_play to load vars for managed_node1
15794 1726882621.18386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882621.24129: done with get_vars()
15794 1726882621.24177: done getting variables
15794 1726882621.24658: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024  21:37:01 -0400 (0:00:00.145)       0:00:18.804 ******
15794 1726882621.24695: entering _queue_task() for managed_node1/fail
15794 1726882621.25762: worker is 1 (out of 1 available)
15794 1726882621.25776: exiting _queue_task() for managed_node1/fail
15794 1726882621.25793: done queuing things up, now waiting for results queue to drain
15794 1726882621.25795: waiting for pending results...
15794 1726882621.26189: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
15794 1726882621.26637: in run() - task 0affe814-3a2d-94e5-e48f-00000000001c
15794 1726882621.26644: variable 'ansible_search_path' from source: unknown
15794 1726882621.26654: variable 'ansible_search_path' from source: unknown
15794 1726882621.26773: calling self._execute()
15794 1726882621.27009: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882621.27013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882621.27018: variable 'omit' from source: magic vars
15794 1726882621.27770: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.27773: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882621.27868: variable 'network_state' from source: role '' defaults
15794 1726882621.27881: Evaluated conditional (network_state != {}): False
15794 1726882621.27887: when evaluation is False, skipping this task
15794 1726882621.27891: _execute() done
15794 1726882621.27896: dumping result to json
15794 1726882621.27901: done dumping result, returning
15794 1726882621.27910: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-94e5-e48f-00000000001c]
15794 1726882621.27919: sending task result for task 0affe814-3a2d-94e5-e48f-00000000001c
15794 1726882621.28053: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000001c
15794 1726882621.28056: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15794 1726882621.28143: no more pending results, returning what we have
15794 1726882621.28147: results queue empty
15794 1726882621.28149: checking for any_errors_fatal
15794 1726882621.28155: done checking for any_errors_fatal
15794 1726882621.28156: checking for max_fail_percentage
15794 1726882621.28158: done checking for max_fail_percentage
15794 1726882621.28159: checking to see if all hosts have failed and the running result is not ok
15794 1726882621.28160: done checking to see if all hosts have failed
15794 1726882621.28161: getting the remaining hosts for this loop
15794 1726882621.28163: done getting the remaining hosts for this loop
15794 1726882621.28167: getting the next task for host managed_node1
15794 1726882621.28172: done getting next task for host managed_node1
15794 1726882621.28176: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
15794 1726882621.28181: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882621.28196: getting variables
15794 1726882621.28198: in VariableManager get_vars()
15794 1726882621.28233: Calling all_inventory to load vars for managed_node1
15794 1726882621.28238: Calling groups_inventory to load vars for managed_node1
15794 1726882621.28241: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882621.28251: Calling all_plugins_play to load vars for managed_node1
15794 1726882621.28254: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882621.28257: Calling groups_plugins_play to load vars for managed_node1
15794 1726882621.31027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882621.35509: done with get_vars()
15794 1726882621.35547: done getting variables
15794 1726882621.35633: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024  21:37:01 -0400 (0:00:00.109)       0:00:18.914 ******
15794 1726882621.35675: entering _queue_task() for managed_node1/fail
15794 1726882621.36086: worker is 1 (out of 1 available)
15794 1726882621.36156: exiting _queue_task() for managed_node1/fail
15794 1726882621.36170: done queuing things up, now waiting for results queue to drain
15794 1726882621.36172: waiting for pending results...
15794 1726882621.36556: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
15794 1726882621.36561: in run() - task 0affe814-3a2d-94e5-e48f-00000000001d
15794 1726882621.36565: variable 'ansible_search_path' from source: unknown
15794 1726882621.36571: variable 'ansible_search_path' from source: unknown
15794 1726882621.36619: calling self._execute()
15794 1726882621.36940: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882621.36945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882621.36948: variable 'omit' from source: magic vars
15794 1726882621.37628: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.37649: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882621.38224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15794 1726882621.42959: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15794 1726882621.43118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15794 1726882621.43165: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15794 1726882621.43214: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15794 1726882621.43246: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15794 1726882621.43374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882621.43496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882621.43563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882621.43626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882621.43645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882621.43773: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.43789: Evaluated conditional (ansible_distribution_major_version | int > 9): True
15794 1726882621.43964: variable 'ansible_distribution' from source: facts
15794 1726882621.43967: variable '__network_rh_distros' from source: role '' defaults
15794 1726882621.44039: Evaluated conditional (ansible_distribution in __network_rh_distros): False
15794 1726882621.44044: when evaluation is False, skipping this task
15794 1726882621.44047: _execute() done
15794 1726882621.44050: dumping result to json
15794 1726882621.44053: done dumping result, returning
15794 1726882621.44060: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-94e5-e48f-00000000001d]
15794 1726882621.44063: sending task result for task 0affe814-3a2d-94e5-e48f-00000000001d
15794 1726882621.44139: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000001d
15794 1726882621.44142: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
15794 1726882621.44217: no more pending results, returning what we have
15794 1726882621.44222: results queue empty
15794 1726882621.44224: checking for any_errors_fatal
15794 1726882621.44230: done checking for any_errors_fatal
15794 1726882621.44231: checking for max_fail_percentage
15794 1726882621.44233: done checking for max_fail_percentage
15794 1726882621.44236: checking to see if all hosts have failed and the running result is not ok
15794 1726882621.44237: done checking to see if all hosts have failed
15794 1726882621.44238: getting the remaining hosts for this loop
15794 1726882621.44240: done getting the remaining hosts for this loop
15794 1726882621.44247: getting the next task for host managed_node1
15794 1726882621.44255: done getting next task for host managed_node1
15794 1726882621.44260: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
15794 1726882621.44263: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882621.44278: getting variables
15794 1726882621.44281: in VariableManager get_vars()
15794 1726882621.44326: Calling all_inventory to load vars for managed_node1
15794 1726882621.44329: Calling groups_inventory to load vars for managed_node1
15794 1726882621.44332: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882621.44564: Calling all_plugins_play to load vars for managed_node1
15794 1726882621.44568: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882621.44572: Calling groups_plugins_play to load vars for managed_node1
15794 1726882621.46843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882621.51016: done with get_vars()
15794 1726882621.51128: done getting variables
15794 1726882621.51298: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024  21:37:01 -0400 (0:00:00.156)       0:00:19.071 ******
15794 1726882621.51366: entering _queue_task() for managed_node1/dnf
15794 1726882621.51814: worker is 1 (out of 1 available)
15794 1726882621.51828: exiting _queue_task() for managed_node1/dnf
15794 1726882621.51843: done queuing things up, now waiting for results queue to drain
15794 1726882621.51845: waiting for pending results...
15794 1726882621.52252: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
15794 1726882621.52257: in run() - task 0affe814-3a2d-94e5-e48f-00000000001e
15794 1726882621.52277: variable 'ansible_search_path' from source: unknown
15794 1726882621.52289: variable 'ansible_search_path' from source: unknown
15794 1726882621.52341: calling self._execute()
15794 1726882621.52446: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882621.52461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882621.52481: variable 'omit' from source: magic vars
15794 1726882621.52911: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.52929: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882621.53239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15794 1726882621.55957: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15794 1726882621.56076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15794 1726882621.56124: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15794 1726882621.56185: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15794 1726882621.56239: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15794 1726882621.56340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882621.56383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882621.56440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882621.56486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882621.56539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882621.56656: variable 'ansible_distribution' from source: facts
15794 1726882621.56668: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.56682: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
15794 1726882621.56843: variable '__network_wireless_connections_defined' from source: role '' defaults
15794 1726882621.57239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882621.57242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882621.57245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882621.57254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882621.57275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882621.57328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882621.57365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882621.57403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882621.57460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882621.57483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882621.57540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882621.57572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882621.57604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882621.57658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882621.57681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882621.57890: variable 'network_connections' from source: play vars
15794 1726882621.57909: variable 'interface' from source: set_fact
15794 1726882621.57998: variable 'interface' from source: set_fact
15794 1726882621.58241: variable 'interface' from source: set_fact
15794 1726882621.58244: variable 'interface' from source: set_fact
15794 1726882621.58247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15794 1726882621.58411: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15794 1726882621.58462: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15794 1726882621.58905: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15794 1726882621.58949: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15794 1726882621.59006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15794 1726882621.59039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15794 1726882621.59084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882621.59122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15794 1726882621.59196: variable '__network_team_connections_defined' from source: role '' defaults
15794 1726882621.59519: variable 'network_connections' from source: play vars
15794 1726882621.59532: variable 'interface' from source: set_fact
15794 1726882621.59611: variable 'interface' from source: set_fact
15794 1726882621.59626: variable 'interface' from source: set_fact
15794 1726882621.59709: variable 'interface' from source: set_fact
15794 1726882621.59754: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15794 1726882621.59763: when evaluation is False, skipping this task
15794 1726882621.59771: _execute() done
15794 1726882621.59779: dumping result to json
15794 1726882621.59790: done dumping result, returning
15794 1726882621.59804: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-00000000001e]
15794 1726882621.59823: sending task result for task 0affe814-3a2d-94e5-e48f-00000000001e
15794 1726882621.59948: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000001e
15794 1726882621.59956: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15794 1726882621.60021: no more pending results, returning what we have
15794 1726882621.60025: results queue empty
15794 1726882621.60026: checking for any_errors_fatal
15794 1726882621.60037: done checking for any_errors_fatal
15794 1726882621.60038: checking for max_fail_percentage
15794 1726882621.60040: done checking for max_fail_percentage
15794 1726882621.60041: checking to see if all hosts have failed and the running result is not ok
15794 1726882621.60042: done checking to see if all hosts have failed
15794 1726882621.60043: getting the remaining hosts for this loop
15794 1726882621.60045: done getting the remaining hosts for this loop
15794 1726882621.60049: getting the next task for host managed_node1
15794 1726882621.60056: done getting next task for host managed_node1
15794 1726882621.60061: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
15794 1726882621.60177: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882621.60196: getting variables
15794 1726882621.60198: in VariableManager get_vars()
15794 1726882621.60299: Calling all_inventory to load vars for managed_node1
15794 1726882621.60302: Calling groups_inventory to load vars for managed_node1
15794 1726882621.60306: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882621.60316: Calling all_plugins_play to load vars for managed_node1
15794 1726882621.60320: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882621.60324: Calling groups_plugins_play to load vars for managed_node1
15794 1726882621.69566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882621.73699: done with get_vars()
15794 1726882621.73790: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15794 1726882621.73890: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024  21:37:01 -0400 (0:00:00.225)       0:00:19.297 ******
15794 1726882621.73922: entering _queue_task() for managed_node1/yum
15794 1726882621.73924: Creating lock for yum
15794 1726882621.74409: worker is 1 (out of 1 available)
15794 1726882621.74423: exiting _queue_task() for managed_node1/yum
15794 1726882621.74439: done queuing things up, now waiting for results queue to drain
15794 1726882621.74440: waiting for pending results...
15794 1726882621.74752: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
15794 1726882621.75041: in run() - task 0affe814-3a2d-94e5-e48f-00000000001f
15794 1726882621.75045: variable 'ansible_search_path' from source: unknown
15794 1726882621.75048: variable 'ansible_search_path' from source: unknown
15794 1726882621.75052: calling self._execute()
15794 1726882621.75080: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882621.75094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882621.75111: variable 'omit' from source: magic vars
15794 1726882621.75557: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.75578: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882621.75811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15794 1726882621.79514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15794 1726882621.79650: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15794 1726882621.79774: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15794 1726882621.79832: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15794 1726882621.79896: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15794 1726882621.80009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882621.80072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882621.80128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882621.80192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882621.80259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882621.80449: variable 'ansible_distribution_major_version' from source: facts
15794 1726882621.80473: Evaluated conditional (ansible_distribution_major_version | int < 8): False
15794 1726882621.80486: when evaluation is False, skipping this task
15794 1726882621.80692: _execute() done
15794 1726882621.80696: dumping result to json
15794 1726882621.80699: done dumping result, returning
15794 1726882621.80702: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-00000000001f]
15794 1726882621.80705: sending task result for task 0affe814-3a2d-94e5-e48f-00000000001f
15794 1726882621.80797: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000001f
15794 1726882621.80802: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
15794 1726882621.80950: no more pending results, returning
what we have 15794 1726882621.80956: results queue empty 15794 1726882621.80957: checking for any_errors_fatal 15794 1726882621.80969: done checking for any_errors_fatal 15794 1726882621.81139: checking for max_fail_percentage 15794 1726882621.81142: done checking for max_fail_percentage 15794 1726882621.81143: checking to see if all hosts have failed and the running result is not ok 15794 1726882621.81144: done checking to see if all hosts have failed 15794 1726882621.81145: getting the remaining hosts for this loop 15794 1726882621.81147: done getting the remaining hosts for this loop 15794 1726882621.81151: getting the next task for host managed_node1 15794 1726882621.81158: done getting next task for host managed_node1 15794 1726882621.81162: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15794 1726882621.81164: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882621.81178: getting variables 15794 1726882621.81180: in VariableManager get_vars() 15794 1726882621.81226: Calling all_inventory to load vars for managed_node1 15794 1726882621.81229: Calling groups_inventory to load vars for managed_node1 15794 1726882621.81232: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882621.81418: Calling all_plugins_play to load vars for managed_node1 15794 1726882621.81422: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882621.81427: Calling groups_plugins_play to load vars for managed_node1 15794 1726882621.84255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882621.90053: done with get_vars() 15794 1726882621.90100: done getting variables 15794 1726882621.90593: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:01 -0400 (0:00:00.167) 0:00:19.464 ****** 15794 1726882621.90628: entering _queue_task() for managed_node1/fail 15794 1726882621.91657: worker is 1 (out of 1 available) 15794 1726882621.91672: exiting _queue_task() for managed_node1/fail 15794 1726882621.91689: done queuing things up, now waiting for results queue to drain 15794 1726882621.91690: waiting for pending results... 
15794 1726882621.92056: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15794 1726882621.92094: in run() - task 0affe814-3a2d-94e5-e48f-000000000020 15794 1726882621.92177: variable 'ansible_search_path' from source: unknown 15794 1726882621.92184: variable 'ansible_search_path' from source: unknown 15794 1726882621.92188: calling self._execute() 15794 1726882621.92289: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882621.92304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882621.92321: variable 'omit' from source: magic vars 15794 1726882621.92840: variable 'ansible_distribution_major_version' from source: facts 15794 1726882621.92990: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882621.93304: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882621.93890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882621.99544: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882621.99549: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882621.99852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882621.99897: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882622.00150: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882622.00578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15794 1726882622.00583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.00586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.00770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.01043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.01345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.01378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.01415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.01588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.01605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.01778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.01939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.02048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.02222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.02243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.02822: variable 'network_connections' from source: play vars 15794 1726882622.03292: variable 'interface' from source: set_fact 15794 1726882622.03503: variable 'interface' from source: set_fact 15794 1726882622.03513: variable 'interface' from source: set_fact 15794 1726882622.03593: variable 'interface' from source: set_fact 15794 1726882622.04028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882622.04925: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882622.05141: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882622.05292: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882622.05449: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882622.05505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882622.05862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882622.05902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.05936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882622.06136: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882622.07442: variable 'network_connections' from source: play vars 15794 1726882622.07449: variable 'interface' from source: set_fact 15794 1726882622.07921: variable 'interface' from source: set_fact 15794 1726882622.07931: variable 'interface' from source: set_fact 15794 1726882622.08009: variable 'interface' from source: set_fact 15794 1726882622.08275: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15794 1726882622.08279: when evaluation is False, skipping this task 15794 1726882622.08285: _execute() done 15794 1726882622.08288: dumping result to json 15794 1726882622.08294: done dumping result, returning 15794 1726882622.08303: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000020] 15794 1726882622.08314: sending task result for task 0affe814-3a2d-94e5-e48f-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15794 1726882622.08484: no more pending results, returning what we have 15794 1726882622.08489: results queue empty 15794 1726882622.08490: checking for any_errors_fatal 15794 1726882622.08498: done checking for any_errors_fatal 15794 1726882622.08499: checking for max_fail_percentage 15794 1726882622.08501: done checking for max_fail_percentage 15794 1726882622.08502: checking to see if all hosts have failed and the running result is not ok 15794 1726882622.08503: done checking to see if all hosts have failed 15794 1726882622.08504: getting the remaining hosts for this loop 15794 1726882622.08506: done getting the remaining hosts for this loop 15794 1726882622.08510: getting the next task for host managed_node1 15794 1726882622.08517: done getting next task for host managed_node1 15794 1726882622.08521: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15794 1726882622.08523: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882622.08540: getting variables 15794 1726882622.08542: in VariableManager get_vars() 15794 1726882622.08588: Calling all_inventory to load vars for managed_node1 15794 1726882622.08592: Calling groups_inventory to load vars for managed_node1 15794 1726882622.08595: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882622.08607: Calling all_plugins_play to load vars for managed_node1 15794 1726882622.08611: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882622.08615: Calling groups_plugins_play to load vars for managed_node1 15794 1726882622.09542: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000020 15794 1726882622.09546: WORKER PROCESS EXITING 15794 1726882622.13395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882622.19326: done with get_vars() 15794 1726882622.19369: done getting variables 15794 1726882622.19640: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:02 -0400 (0:00:00.290) 0:00:19.754 ****** 15794 1726882622.19676: entering _queue_task() for managed_node1/package 15794 1726882622.20232: worker is 1 (out of 1 available) 15794 1726882622.20351: exiting _queue_task() for managed_node1/package 15794 1726882622.20366: done queuing things up, now waiting for results queue to drain 15794 1726882622.20368: waiting for pending results... 
15794 1726882622.21043: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15794 1726882622.21547: in run() - task 0affe814-3a2d-94e5-e48f-000000000021 15794 1726882622.21560: variable 'ansible_search_path' from source: unknown 15794 1726882622.21564: variable 'ansible_search_path' from source: unknown 15794 1726882622.21607: calling self._execute() 15794 1726882622.21828: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882622.21836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882622.22172: variable 'omit' from source: magic vars 15794 1726882622.23550: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.23564: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882622.24190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882622.25489: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882622.25788: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882622.25830: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882622.26147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882622.26622: variable 'network_packages' from source: role '' defaults 15794 1726882622.27118: variable '__network_provider_setup' from source: role '' defaults 15794 1726882622.27131: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882622.27327: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882622.27345: variable '__network_packages_default_nm' from source: role '' defaults 15794 1726882622.27432: variable 
'__network_packages_default_nm' from source: role '' defaults 15794 1726882622.27797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882622.31409: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882622.31503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882622.31565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882622.31614: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882622.31850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882622.31958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.32010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.32051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.32122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.32208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 
1726882622.32220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.32257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.32298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.32364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.32391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.32736: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15794 1726882622.32907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.32945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.33084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.33087: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.33090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.33198: variable 'ansible_python' from source: facts 15794 1726882622.33236: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15794 1726882622.33352: variable '__network_wpa_supplicant_required' from source: role '' defaults 15794 1726882622.33498: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15794 1726882622.33752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.33897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.33902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.34005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.34107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.34254: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.34445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.34458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.34543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.34630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.35173: variable 'network_connections' from source: play vars 15794 1726882622.35176: variable 'interface' from source: set_fact 15794 1726882622.35261: variable 'interface' from source: set_fact 15794 1726882622.35288: variable 'interface' from source: set_fact 15794 1726882622.35414: variable 'interface' from source: set_fact 15794 1726882622.35521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882622.35562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882622.35614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.35662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882622.35728: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882622.36123: variable 'network_connections' from source: play vars 15794 1726882622.36138: variable 'interface' from source: set_fact 15794 1726882622.36277: variable 'interface' from source: set_fact 15794 1726882622.36296: variable 'interface' from source: set_fact 15794 1726882622.36431: variable 'interface' from source: set_fact 15794 1726882622.36513: variable '__network_packages_default_wireless' from source: role '' defaults 15794 1726882622.36628: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882622.37169: variable 'network_connections' from source: play vars 15794 1726882622.37183: variable 'interface' from source: set_fact 15794 1726882622.37388: variable 'interface' from source: set_fact 15794 1726882622.37391: variable 'interface' from source: set_fact 15794 1726882622.37393: variable 'interface' from source: set_fact 15794 1726882622.37440: variable '__network_packages_default_team' from source: role '' defaults 15794 1726882622.37556: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882622.37973: variable 'network_connections' from source: play vars 15794 1726882622.37987: variable 'interface' from source: set_fact 15794 1726882622.38066: variable 'interface' from source: set_fact 15794 1726882622.38083: variable 'interface' from source: set_fact 15794 1726882622.38174: variable 'interface' from source: set_fact 15794 1726882622.38268: variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 
1726882622.38352: variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 1726882622.38371: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882622.38455: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882622.38793: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15794 1726882622.39751: variable 'network_connections' from source: play vars 15794 1726882622.39755: variable 'interface' from source: set_fact 15794 1726882622.39832: variable 'interface' from source: set_fact 15794 1726882622.40041: variable 'interface' from source: set_fact 15794 1726882622.40044: variable 'interface' from source: set_fact 15794 1726882622.40047: variable 'ansible_distribution' from source: facts 15794 1726882622.40049: variable '__network_rh_distros' from source: role '' defaults 15794 1726882622.40052: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.40054: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15794 1726882622.40286: variable 'ansible_distribution' from source: facts 15794 1726882622.40297: variable '__network_rh_distros' from source: role '' defaults 15794 1726882622.40313: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.40326: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15794 1726882622.40792: variable 'ansible_distribution' from source: facts 15794 1726882622.41039: variable '__network_rh_distros' from source: role '' defaults 15794 1726882622.41044: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.41047: variable 'network_provider' from source: set_fact 15794 1726882622.41049: variable 'ansible_facts' from source: unknown 15794 1726882622.43576: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 15794 1726882622.43594: when evaluation is False, skipping this task 15794 1726882622.43604: _execute() done 15794 1726882622.43612: dumping result to json 15794 1726882622.43647: done dumping result, returning 15794 1726882622.43665: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-94e5-e48f-000000000021] 15794 1726882622.43688: sending task result for task 0affe814-3a2d-94e5-e48f-000000000021 15794 1726882622.43989: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000021 15794 1726882622.43992: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15794 1726882622.44128: no more pending results, returning what we have 15794 1726882622.44133: results queue empty 15794 1726882622.44135: checking for any_errors_fatal 15794 1726882622.44145: done checking for any_errors_fatal 15794 1726882622.44146: checking for max_fail_percentage 15794 1726882622.44149: done checking for max_fail_percentage 15794 1726882622.44150: checking to see if all hosts have failed and the running result is not ok 15794 1726882622.44151: done checking to see if all hosts have failed 15794 1726882622.44152: getting the remaining hosts for this loop 15794 1726882622.44154: done getting the remaining hosts for this loop 15794 1726882622.44159: getting the next task for host managed_node1 15794 1726882622.44168: done getting next task for host managed_node1 15794 1726882622.44239: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15794 1726882622.44243: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882622.44260: getting variables 15794 1726882622.44262: in VariableManager get_vars() 15794 1726882622.44464: Calling all_inventory to load vars for managed_node1 15794 1726882622.44468: Calling groups_inventory to load vars for managed_node1 15794 1726882622.44471: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882622.44490: Calling all_plugins_play to load vars for managed_node1 15794 1726882622.44494: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882622.44498: Calling groups_plugins_play to load vars for managed_node1 15794 1726882622.48843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882622.52528: done with get_vars() 15794 1726882622.52658: done getting variables 15794 1726882622.52847: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:02 -0400 (0:00:00.332) 0:00:20.086 ****** 15794 1726882622.52886: entering _queue_task() for managed_node1/package 15794 1726882622.53647: worker is 1 (out of 1 available) 15794 1726882622.53681: exiting _queue_task() for managed_node1/package 15794 1726882622.53694: done queuing things up, now waiting for results queue to drain 15794 1726882622.53696: waiting for pending results... 
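[editor's note] The "Install packages" task above was skipped because the conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False. A minimal sketch of what Jinja2's `subset` test checks, using hypothetical package data (the actual values of `network_packages` and `ansible_facts.packages` are not shown in this log):

```python
# Hypothetical stand-ins for the variables named in the log output.
network_packages = ["NetworkManager"]
installed_packages = {"NetworkManager": ["1.x"], "openssh-server": ["9.x"]}

# "a is subset(b)" is true when every element of a appears in b.
already_installed = set(network_packages) <= set(installed_packages.keys())

# The task runs only when the negation is True, i.e. something is missing.
needs_install = not already_installed
print(needs_install)  # False -> Ansible skips the task, as in the log
```

This matches the log's `"false_condition": "not network_packages is subset(ansible_facts.packages.keys())"` skip reason: all requested packages were already present.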
15794 1726882622.54359: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15794 1726882622.54471: in run() - task 0affe814-3a2d-94e5-e48f-000000000022 15794 1726882622.54537: variable 'ansible_search_path' from source: unknown 15794 1726882622.54543: variable 'ansible_search_path' from source: unknown 15794 1726882622.54546: calling self._execute() 15794 1726882622.54688: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882622.54692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882622.54695: variable 'omit' from source: magic vars 15794 1726882622.55160: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.55171: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882622.55279: variable 'network_state' from source: role '' defaults 15794 1726882622.55292: Evaluated conditional (network_state != {}): False 15794 1726882622.55295: when evaluation is False, skipping this task 15794 1726882622.55299: _execute() done 15794 1726882622.55301: dumping result to json 15794 1726882622.55307: done dumping result, returning 15794 1726882622.55317: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-94e5-e48f-000000000022] 15794 1726882622.55323: sending task result for task 0affe814-3a2d-94e5-e48f-000000000022 15794 1726882622.55430: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000022 15794 1726882622.55439: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882622.55497: no more pending results, returning what we have 15794 1726882622.55502: results queue empty 15794 1726882622.55503: checking 
for any_errors_fatal 15794 1726882622.55511: done checking for any_errors_fatal 15794 1726882622.55512: checking for max_fail_percentage 15794 1726882622.55514: done checking for max_fail_percentage 15794 1726882622.55515: checking to see if all hosts have failed and the running result is not ok 15794 1726882622.55516: done checking to see if all hosts have failed 15794 1726882622.55516: getting the remaining hosts for this loop 15794 1726882622.55519: done getting the remaining hosts for this loop 15794 1726882622.55523: getting the next task for host managed_node1 15794 1726882622.55530: done getting next task for host managed_node1 15794 1726882622.55535: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15794 1726882622.55538: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882622.55568: getting variables 15794 1726882622.55569: in VariableManager get_vars() 15794 1726882622.55605: Calling all_inventory to load vars for managed_node1 15794 1726882622.55608: Calling groups_inventory to load vars for managed_node1 15794 1726882622.55610: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882622.55620: Calling all_plugins_play to load vars for managed_node1 15794 1726882622.55623: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882622.55626: Calling groups_plugins_play to load vars for managed_node1 15794 1726882622.56922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882622.59972: done with get_vars() 15794 1726882622.59997: done getting variables 15794 1726882622.60045: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:02 -0400 (0:00:00.071) 0:00:20.158 ****** 15794 1726882622.60070: entering _queue_task() for managed_node1/package 15794 1726882622.60290: worker is 1 (out of 1 available) 15794 1726882622.60305: exiting _queue_task() for managed_node1/package 15794 1726882622.60318: done queuing things up, now waiting for results queue to drain 15794 1726882622.60319: waiting for pending results... 
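[editor's note] Both nmstate-related install tasks are gated on `network_state != {}`. Per the log, `network_state` came from the role's defaults and the conditional evaluated to False, which implies it was still the empty mapping. A sketch of that gate (treating the empty-dict default as given by the log's evaluation result):

```python
# network_state stayed at its role default; the user never set it in play vars.
network_state = {}

# Gate used by the "Install ... when using network_state variable" tasks.
run_task = network_state != {}
print(run_task)  # False -> both install tasks are skipped
```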
15794 1726882622.60495: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15794 1726882622.60567: in run() - task 0affe814-3a2d-94e5-e48f-000000000023 15794 1726882622.60580: variable 'ansible_search_path' from source: unknown 15794 1726882622.60586: variable 'ansible_search_path' from source: unknown 15794 1726882622.60618: calling self._execute() 15794 1726882622.60699: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882622.60706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882622.60715: variable 'omit' from source: magic vars 15794 1726882622.61036: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.61046: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882622.61150: variable 'network_state' from source: role '' defaults 15794 1726882622.61160: Evaluated conditional (network_state != {}): False 15794 1726882622.61163: when evaluation is False, skipping this task 15794 1726882622.61167: _execute() done 15794 1726882622.61171: dumping result to json 15794 1726882622.61176: done dumping result, returning 15794 1726882622.61187: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-94e5-e48f-000000000023] 15794 1726882622.61193: sending task result for task 0affe814-3a2d-94e5-e48f-000000000023 15794 1726882622.61296: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000023 15794 1726882622.61299: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882622.61360: no more pending results, returning what we have 15794 1726882622.61364: results queue empty 15794 1726882622.61365: checking for 
any_errors_fatal 15794 1726882622.61370: done checking for any_errors_fatal 15794 1726882622.61371: checking for max_fail_percentage 15794 1726882622.61373: done checking for max_fail_percentage 15794 1726882622.61374: checking to see if all hosts have failed and the running result is not ok 15794 1726882622.61375: done checking to see if all hosts have failed 15794 1726882622.61376: getting the remaining hosts for this loop 15794 1726882622.61377: done getting the remaining hosts for this loop 15794 1726882622.61381: getting the next task for host managed_node1 15794 1726882622.61387: done getting next task for host managed_node1 15794 1726882622.61390: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15794 1726882622.61393: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882622.61406: getting variables 15794 1726882622.61408: in VariableManager get_vars() 15794 1726882622.61444: Calling all_inventory to load vars for managed_node1 15794 1726882622.61447: Calling groups_inventory to load vars for managed_node1 15794 1726882622.61450: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882622.61459: Calling all_plugins_play to load vars for managed_node1 15794 1726882622.61461: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882622.61463: Calling groups_plugins_play to load vars for managed_node1 15794 1726882622.62900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882622.64792: done with get_vars() 15794 1726882622.64813: done getting variables 15794 1726882622.64895: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:02 -0400 (0:00:00.048) 0:00:20.207 ****** 15794 1726882622.64916: entering _queue_task() for managed_node1/service 15794 1726882622.64918: Creating lock for service 15794 1726882622.65157: worker is 1 (out of 1 available) 15794 1726882622.65172: exiting _queue_task() for managed_node1/service 15794 1726882622.65187: done queuing things up, now waiting for results queue to drain 15794 1726882622.65189: waiting for pending results... 
15794 1726882622.65370: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15794 1726882622.65450: in run() - task 0affe814-3a2d-94e5-e48f-000000000024 15794 1726882622.65463: variable 'ansible_search_path' from source: unknown 15794 1726882622.65466: variable 'ansible_search_path' from source: unknown 15794 1726882622.65501: calling self._execute() 15794 1726882622.65585: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882622.65589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882622.65599: variable 'omit' from source: magic vars 15794 1726882622.65914: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.65924: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882622.66027: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882622.66201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882622.68852: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882622.68937: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882622.68973: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882622.69025: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882622.69078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882622.69152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15794 1726882622.69180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.69204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.69241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.69255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.69303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.69323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.69349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.69381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.69397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.69433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.69456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.69476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.69547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.69579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.70277: variable 'network_connections' from source: play vars 15794 1726882622.70280: variable 'interface' from source: set_fact 15794 1726882622.70358: variable 'interface' from source: set_fact 15794 1726882622.70383: variable 'interface' from source: set_fact 15794 1726882622.70452: variable 'interface' from source: set_fact 15794 1726882622.70560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882622.70940: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882622.70990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882622.71044: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882622.71136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882622.71139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882622.71162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882622.71195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.71237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882622.71317: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882622.71697: variable 'network_connections' from source: play vars 15794 1726882622.71701: variable 'interface' from source: set_fact 15794 1726882622.71792: variable 'interface' from source: set_fact 15794 1726882622.71804: variable 'interface' from source: set_fact 15794 1726882622.71876: variable 'interface' from source: set_fact 15794 1726882622.71918: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15794 1726882622.71922: when evaluation is False, skipping this task 15794 1726882622.71925: _execute() done 15794 1726882622.71930: dumping result to json 15794 1726882622.71936: done dumping result, returning 15794 1726882622.71943: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000024] 15794 1726882622.71952: sending task result for task 0affe814-3a2d-94e5-e48f-000000000024 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15794 1726882622.72282: no more pending results, returning what we have 15794 1726882622.72286: results queue empty 15794 1726882622.72288: checking for any_errors_fatal 15794 1726882622.72294: done checking for any_errors_fatal 15794 1726882622.72295: checking for max_fail_percentage 15794 1726882622.72298: done checking for max_fail_percentage 15794 1726882622.72299: checking to see if all hosts have failed and the running result is not ok 15794 1726882622.72300: done checking to see if all hosts have failed 15794 1726882622.72300: getting the remaining hosts for this loop 15794 1726882622.72302: done getting the remaining hosts for this loop 15794 1726882622.72306: getting the next task for host managed_node1 15794 1726882622.72312: done getting next task for host managed_node1 15794 1726882622.72316: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15794 1726882622.72318: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882622.72332: getting variables 15794 1726882622.72336: in VariableManager get_vars() 15794 1726882622.72385: Calling all_inventory to load vars for managed_node1 15794 1726882622.72389: Calling groups_inventory to load vars for managed_node1 15794 1726882622.72392: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882622.72402: Calling all_plugins_play to load vars for managed_node1 15794 1726882622.72405: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882622.72409: Calling groups_plugins_play to load vars for managed_node1 15794 1726882622.72962: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000024 15794 1726882622.72966: WORKER PROCESS EXITING 15794 1726882622.76641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882622.80160: done with get_vars() 15794 1726882622.80198: done getting variables 15794 1726882622.80275: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:02 -0400 (0:00:00.153) 0:00:20.361 ****** 15794 1726882622.80306: entering _queue_task() for managed_node1/service 15794 1726882622.80710: worker is 1 (out of 1 available) 15794 1726882622.80725: exiting _queue_task() for managed_node1/service 15794 1726882622.80740: done queuing things up, now waiting for results queue to drain 15794 1726882622.80742: waiting for pending results... 
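[editor's note] The "Restart NetworkManager due to wireless or team interfaces" task above was skipped because neither trigger variable was set. A sketch with hypothetical boolean values consistent with the log's False result for `__network_wireless_connections_defined or __network_team_connections_defined`:

```python
# Stand-ins for the two role-default flags named in the log.
wireless_defined = False  # __network_wireless_connections_defined
team_defined = False      # __network_team_connections_defined

# NetworkManager is restarted only if either kind of connection is defined.
restart_needed = wireless_defined or team_defined
print(restart_needed)  # False -> restart task skipped
```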
15794 1726882622.81012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15794 1726882622.81111: in run() - task 0affe814-3a2d-94e5-e48f-000000000025 15794 1726882622.81129: variable 'ansible_search_path' from source: unknown 15794 1726882622.81132: variable 'ansible_search_path' from source: unknown 15794 1726882622.81217: calling self._execute() 15794 1726882622.81269: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882622.81279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882622.81294: variable 'omit' from source: magic vars 15794 1726882622.81717: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.81730: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882622.82062: variable 'network_provider' from source: set_fact 15794 1726882622.82066: variable 'network_state' from source: role '' defaults 15794 1726882622.82074: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15794 1726882622.82077: variable 'omit' from source: magic vars 15794 1726882622.82082: variable 'omit' from source: magic vars 15794 1726882622.82105: variable 'network_service_name' from source: role '' defaults 15794 1726882622.82190: variable 'network_service_name' from source: role '' defaults 15794 1726882622.82330: variable '__network_provider_setup' from source: role '' defaults 15794 1726882622.82339: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882622.82432: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882622.82443: variable '__network_packages_default_nm' from source: role '' defaults 15794 1726882622.82527: variable '__network_packages_default_nm' from source: role '' defaults 15794 1726882622.82892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15794 1726882622.85502: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882622.85563: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882622.85602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882622.85646: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882622.85688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882622.85802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.85806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.85842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.86017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.86021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.86024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15794 1726882622.86026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.86102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.86151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.86155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.86468: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15794 1726882622.86617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.86646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.86685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.86769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.86776: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.86896: variable 'ansible_python' from source: facts 15794 1726882622.86904: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15794 1726882622.87025: variable '__network_wpa_supplicant_required' from source: role '' defaults 15794 1726882622.87111: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15794 1726882622.87274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.87331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.87346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.87486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.87491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.87495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882622.87571: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882622.87585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.87613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882622.87670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882622.87808: variable 'network_connections' from source: play vars 15794 1726882622.87815: variable 'interface' from source: set_fact 15794 1726882622.87912: variable 'interface' from source: set_fact 15794 1726882622.87915: variable 'interface' from source: set_fact 15794 1726882622.87999: variable 'interface' from source: set_fact 15794 1726882622.88136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882622.88360: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882622.88412: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882622.88464: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882622.88561: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882622.88576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882622.88613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882622.88654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882622.88696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882622.88781: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882622.89102: variable 'network_connections' from source: play vars 15794 1726882622.89113: variable 'interface' from source: set_fact 15794 1726882622.89196: variable 'interface' from source: set_fact 15794 1726882622.89221: variable 'interface' from source: set_fact 15794 1726882622.89293: variable 'interface' from source: set_fact 15794 1726882622.89456: variable '__network_packages_default_wireless' from source: role '' defaults 15794 1726882622.89460: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882622.89809: variable 'network_connections' from source: play vars 15794 1726882622.89815: variable 'interface' from source: set_fact 15794 1726882622.89888: variable 'interface' from source: set_fact 15794 1726882622.89895: variable 'interface' from source: set_fact 15794 1726882622.89970: variable 'interface' from source: set_fact 15794 1726882622.89996: variable '__network_packages_default_team' from source: role '' defaults 15794 1726882622.90090: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882622.90467: variable 
'network_connections' from source: play vars 15794 1726882622.90471: variable 'interface' from source: set_fact 15794 1726882622.90560: variable 'interface' from source: set_fact 15794 1726882622.90564: variable 'interface' from source: set_fact 15794 1726882622.90743: variable 'interface' from source: set_fact 15794 1726882622.90751: variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 1726882622.90815: variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 1726882622.90822: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882622.90899: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882622.91278: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15794 1726882622.91933: variable 'network_connections' from source: play vars 15794 1726882622.91942: variable 'interface' from source: set_fact 15794 1726882622.92047: variable 'interface' from source: set_fact 15794 1726882622.92051: variable 'interface' from source: set_fact 15794 1726882622.92113: variable 'interface' from source: set_fact 15794 1726882622.92246: variable 'ansible_distribution' from source: facts 15794 1726882622.92250: variable '__network_rh_distros' from source: role '' defaults 15794 1726882622.92252: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.92255: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15794 1726882622.92378: variable 'ansible_distribution' from source: facts 15794 1726882622.92442: variable '__network_rh_distros' from source: role '' defaults 15794 1726882622.92448: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.92451: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15794 1726882622.92665: variable 'ansible_distribution' from source: 
facts 15794 1726882622.92668: variable '__network_rh_distros' from source: role '' defaults 15794 1726882622.92671: variable 'ansible_distribution_major_version' from source: facts 15794 1726882622.92673: variable 'network_provider' from source: set_fact 15794 1726882622.92699: variable 'omit' from source: magic vars 15794 1726882622.92732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882622.92771: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882622.92787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882622.92807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882622.92819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882622.92947: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882622.92951: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882622.92953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882622.92974: Set connection var ansible_connection to ssh 15794 1726882622.92989: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882622.92994: Set connection var ansible_pipelining to False 15794 1726882622.93003: Set connection var ansible_shell_executable to /bin/sh 15794 1726882622.93006: Set connection var ansible_shell_type to sh 15794 1726882622.93018: Set connection var ansible_timeout to 10 15794 1726882622.93054: variable 'ansible_shell_executable' from source: unknown 15794 1726882622.93058: variable 'ansible_connection' from source: unknown 15794 1726882622.93061: variable 'ansible_module_compression' from source: unknown 15794 1726882622.93064: 
variable 'ansible_shell_type' from source: unknown 15794 1726882622.93069: variable 'ansible_shell_executable' from source: unknown 15794 1726882622.93073: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882622.93079: variable 'ansible_pipelining' from source: unknown 15794 1726882622.93085: variable 'ansible_timeout' from source: unknown 15794 1726882622.93111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882622.93223: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882622.93237: variable 'omit' from source: magic vars 15794 1726882622.93316: starting attempt loop 15794 1726882622.93321: running the handler 15794 1726882622.93349: variable 'ansible_facts' from source: unknown 15794 1726882622.94557: _low_level_execute_command(): starting 15794 1726882622.94561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882622.95420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882622.95474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882622.95532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882622.97336: stdout chunk (state=3): >>>/root <<< 15794 1726882622.97474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882622.97600: stderr chunk (state=3): >>><<< 15794 1726882622.97604: stdout chunk (state=3): >>><<< 15794 1726882622.97748: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882622.97751: _low_level_execute_command(): 
starting 15794 1726882622.97755: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413 `" && echo ansible-tmp-1726882622.9763978-16516-117117425297413="` echo /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413 `" ) && sleep 0' 15794 1726882622.98607: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882622.98649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882622.98668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882622.98774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882622.98778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882622.98895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882622.98956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882622.99000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882622.99083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 
1726882623.01241: stdout chunk (state=3): >>>ansible-tmp-1726882622.9763978-16516-117117425297413=/root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413 <<< 15794 1726882623.01438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882623.01445: stdout chunk (state=3): >>><<< 15794 1726882623.01447: stderr chunk (state=3): >>><<< 15794 1726882623.01646: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882622.9763978-16516-117117425297413=/root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882623.01649: variable 'ansible_module_compression' from source: unknown 15794 1726882623.01653: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15794 1726882623.01656: ANSIBALLZ: Acquiring lock 15794 1726882623.01658: ANSIBALLZ: Lock acquired: 
139758818400528 15794 1726882623.01660: ANSIBALLZ: Creating module 15794 1726882623.51528: ANSIBALLZ: Writing module into payload 15794 1726882623.52097: ANSIBALLZ: Writing module 15794 1726882623.52116: ANSIBALLZ: Renaming module 15794 1726882623.52128: ANSIBALLZ: Done creating module 15794 1726882623.52158: variable 'ansible_facts' from source: unknown 15794 1726882623.52559: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/AnsiballZ_systemd.py 15794 1726882623.52931: Sending initial data 15794 1726882623.52977: Sent initial data (156 bytes) 15794 1726882623.54388: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882623.54481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882623.54512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882623.54529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882623.54687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882623.56548: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15794 1726882623.56552: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882623.56588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882623.56705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpot2tdzwg /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/AnsiballZ_systemd.py <<< 15794 1726882623.56782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpot2tdzwg" to remote "/root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/AnsiballZ_systemd.py" <<< 15794 1726882623.56785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/AnsiballZ_systemd.py" <<< 15794 1726882623.61745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882623.61749: stderr chunk (state=3): >>><<< 15794 1726882623.61752: stdout chunk (state=3): >>><<< 15794 1726882623.61754: done 
transferring module to remote 15794 1726882623.61757: _low_level_execute_command(): starting 15794 1726882623.61759: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/ /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/AnsiballZ_systemd.py && sleep 0' 15794 1726882623.63053: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882623.63136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882623.63246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882623.63339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882623.65273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882623.65414: stderr chunk (state=3): >>><<< 15794 1726882623.65418: stdout chunk (state=3): >>><<< 15794 1726882623.65528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882623.65531: _low_level_execute_command(): starting 15794 1726882623.65541: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/AnsiballZ_systemd.py && sleep 0' 15794 1726882623.66915: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882623.66952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882623.66960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882623.67037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882623.67120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882624.00264: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "652", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ExecMainStartTimestampMonotonic": "15833159", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "652", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "1244794000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", 
"MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "network-pre.target system.slice systemd-journald.socket dbus.socket sysinit.target dbus-broker.service cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:32 EDT", "StateChangeTimestampMonotonic": "366878571", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:41 EDT", "InactiveExitTimestampMonotonic": "15833421", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ActiveEnterTimestampMonotonic": "15948855", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ConditionTimestampMonotonic": "15822215", "AssertTimestamp": "Fri 2024-09-20 21:27:41 EDT", "AssertTimestampMonotonic": "15822218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "9d67906d6bf74ff48c21207bf47afee4", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15794 1726882624.02359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882624.02449: stdout chunk (state=3): >>><<< 15794 1726882624.02452: stderr chunk (state=3): >>><<< 15794 1726882624.02601: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "652", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ExecMainStartTimestampMonotonic": "15833159", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "652", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "1244794000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service 
shutdown.target multi-user.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "network-pre.target system.slice systemd-journald.socket dbus.socket sysinit.target dbus-broker.service cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:32 EDT", "StateChangeTimestampMonotonic": "366878571", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:41 EDT", "InactiveExitTimestampMonotonic": "15833421", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ActiveEnterTimestampMonotonic": "15948855", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ConditionTimestampMonotonic": "15822215", "AssertTimestamp": "Fri 2024-09-20 21:27:41 EDT", "AssertTimestampMonotonic": "15822218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "9d67906d6bf74ff48c21207bf47afee4", "CollectMode": 
"inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882624.02861: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882624.02908: _low_level_execute_command(): starting 15794 1726882624.02936: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882622.9763978-16516-117117425297413/ > /dev/null 2>&1 && sleep 0' 15794 1726882624.04124: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882624.04187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882624.04206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882624.04233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882624.04431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882624.06445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882624.06465: stdout chunk (state=3): >>><<< 15794 1726882624.06484: stderr chunk (state=3): >>><<< 15794 1726882624.06508: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882624.06523: handler run complete 15794 1726882624.06631: attempt loop complete, returning result 15794 
1726882624.06644: _execute() done 15794 1726882624.06652: dumping result to json 15794 1726882624.06842: done dumping result, returning 15794 1726882624.06846: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-94e5-e48f-000000000025] 15794 1726882624.06849: sending task result for task 0affe814-3a2d-94e5-e48f-000000000025 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882624.07316: no more pending results, returning what we have 15794 1726882624.07320: results queue empty 15794 1726882624.07321: checking for any_errors_fatal 15794 1726882624.07403: done checking for any_errors_fatal 15794 1726882624.07405: checking for max_fail_percentage 15794 1726882624.07408: done checking for max_fail_percentage 15794 1726882624.07409: checking to see if all hosts have failed and the running result is not ok 15794 1726882624.07410: done checking to see if all hosts have failed 15794 1726882624.07411: getting the remaining hosts for this loop 15794 1726882624.07413: done getting the remaining hosts for this loop 15794 1726882624.07418: getting the next task for host managed_node1 15794 1726882624.07426: done getting next task for host managed_node1 15794 1726882624.07431: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15794 1726882624.07437: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882624.07454: getting variables 15794 1726882624.07458: in VariableManager get_vars() 15794 1726882624.07500: Calling all_inventory to load vars for managed_node1 15794 1726882624.07725: Calling groups_inventory to load vars for managed_node1 15794 1726882624.07729: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882624.07743: Calling all_plugins_play to load vars for managed_node1 15794 1726882624.07748: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882624.07752: Calling groups_plugins_play to load vars for managed_node1 15794 1726882624.08329: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000025 15794 1726882624.08333: WORKER PROCESS EXITING 15794 1726882624.11124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882624.14182: done with get_vars() 15794 1726882624.14221: done getting variables 15794 1726882624.14291: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:04 -0400 (0:00:01.340) 0:00:21.701 ****** 15794 1726882624.14323: entering _queue_task() for managed_node1/service 15794 1726882624.14717: worker is 1 (out of 1 available) 15794 1726882624.14731: exiting _queue_task() for managed_node1/service 15794 1726882624.14761: done queuing things up, now waiting for results queue to drain 15794 1726882624.14763: waiting for pending results... 
15794 1726882624.15093: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15794 1726882624.15304: in run() - task 0affe814-3a2d-94e5-e48f-000000000026 15794 1726882624.15333: variable 'ansible_search_path' from source: unknown 15794 1726882624.15357: variable 'ansible_search_path' from source: unknown 15794 1726882624.15448: calling self._execute() 15794 1726882624.15542: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882624.15566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882624.15612: variable 'omit' from source: magic vars 15794 1726882624.16642: variable 'ansible_distribution_major_version' from source: facts 15794 1726882624.16646: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882624.16940: variable 'network_provider' from source: set_fact 15794 1726882624.16954: Evaluated conditional (network_provider == "nm"): True 15794 1726882624.17223: variable '__network_wpa_supplicant_required' from source: role '' defaults 15794 1726882624.17487: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15794 1726882624.18066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882624.22114: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882624.22215: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882624.22270: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882624.22325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882624.22365: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882624.22486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882624.22538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882624.22578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882624.22645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882624.22671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882624.22777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882624.22800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882624.23151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882624.23154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882624.23158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882624.23161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882624.23164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882624.23167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882624.23320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882624.23347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882624.23568: variable 'network_connections' from source: play vars 15794 1726882624.23596: variable 'interface' from source: set_fact 15794 1726882624.23747: variable 'interface' from source: set_fact 15794 1726882624.23762: variable 'interface' from source: set_fact 15794 1726882624.23848: variable 'interface' from source: set_fact 15794 1726882624.24022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882624.24234: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882624.24286: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882624.24329: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882624.24372: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882624.24494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882624.24530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882624.24628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882624.24669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882624.24729: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882624.25165: variable 'network_connections' from source: play vars 15794 1726882624.25169: variable 'interface' from source: set_fact 15794 1726882624.25182: variable 'interface' from source: set_fact 15794 1726882624.25195: variable 'interface' from source: set_fact 15794 1726882624.25278: variable 'interface' from source: set_fact 15794 1726882624.25339: Evaluated conditional (__network_wpa_supplicant_required): False 15794 1726882624.25350: when evaluation is False, skipping this task 15794 1726882624.25358: _execute() done 15794 1726882624.25374: dumping result 
to json 15794 1726882624.25382: done dumping result, returning 15794 1726882624.25397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-94e5-e48f-000000000026] 15794 1726882624.25409: sending task result for task 0affe814-3a2d-94e5-e48f-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15794 1726882624.25595: no more pending results, returning what we have 15794 1726882624.25599: results queue empty 15794 1726882624.25600: checking for any_errors_fatal 15794 1726882624.25624: done checking for any_errors_fatal 15794 1726882624.25625: checking for max_fail_percentage 15794 1726882624.25627: done checking for max_fail_percentage 15794 1726882624.25628: checking to see if all hosts have failed and the running result is not ok 15794 1726882624.25628: done checking to see if all hosts have failed 15794 1726882624.25629: getting the remaining hosts for this loop 15794 1726882624.25631: done getting the remaining hosts for this loop 15794 1726882624.25638: getting the next task for host managed_node1 15794 1726882624.25761: done getting next task for host managed_node1 15794 1726882624.25766: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15794 1726882624.25768: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882624.25790: getting variables 15794 1726882624.25792: in VariableManager get_vars() 15794 1726882624.25832: Calling all_inventory to load vars for managed_node1 15794 1726882624.25954: Calling groups_inventory to load vars for managed_node1 15794 1726882624.25958: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882624.25971: Calling all_plugins_play to load vars for managed_node1 15794 1726882624.25975: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882624.25982: Calling groups_plugins_play to load vars for managed_node1 15794 1726882624.26503: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000026 15794 1726882624.26507: WORKER PROCESS EXITING 15794 1726882624.30839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882624.34122: done with get_vars() 15794 1726882624.34187: done getting variables 15794 1726882624.34280: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:04 -0400 (0:00:00.199) 0:00:21.901 ****** 15794 1726882624.34316: entering _queue_task() for managed_node1/service 15794 1726882624.34754: worker is 1 (out of 1 available) 15794 1726882624.34769: exiting _queue_task() for managed_node1/service 15794 1726882624.34785: done queuing things up, now waiting for results queue to drain 15794 1726882624.34787: waiting for pending results... 
15794 1726882624.35117: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15794 1726882624.35541: in run() - task 0affe814-3a2d-94e5-e48f-000000000027 15794 1726882624.35545: variable 'ansible_search_path' from source: unknown 15794 1726882624.35547: variable 'ansible_search_path' from source: unknown 15794 1726882624.35550: calling self._execute() 15794 1726882624.35553: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882624.35556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882624.35559: variable 'omit' from source: magic vars 15794 1726882624.35950: variable 'ansible_distribution_major_version' from source: facts 15794 1726882624.35963: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882624.36143: variable 'network_provider' from source: set_fact 15794 1726882624.36151: Evaluated conditional (network_provider == "initscripts"): False 15794 1726882624.36154: when evaluation is False, skipping this task 15794 1726882624.36157: _execute() done 15794 1726882624.36161: dumping result to json 15794 1726882624.36166: done dumping result, returning 15794 1726882624.36175: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-94e5-e48f-000000000027] 15794 1726882624.36186: sending task result for task 0affe814-3a2d-94e5-e48f-000000000027 15794 1726882624.36300: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000027 15794 1726882624.36303: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882624.36385: no more pending results, returning what we have 15794 1726882624.36390: results queue empty 15794 1726882624.36391: checking for any_errors_fatal 15794 1726882624.36402: done checking for 
any_errors_fatal 15794 1726882624.36403: checking for max_fail_percentage 15794 1726882624.36405: done checking for max_fail_percentage 15794 1726882624.36406: checking to see if all hosts have failed and the running result is not ok 15794 1726882624.36407: done checking to see if all hosts have failed 15794 1726882624.36408: getting the remaining hosts for this loop 15794 1726882624.36410: done getting the remaining hosts for this loop 15794 1726882624.36415: getting the next task for host managed_node1 15794 1726882624.36424: done getting next task for host managed_node1 15794 1726882624.36432: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15794 1726882624.36541: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882624.36560: getting variables 15794 1726882624.36563: in VariableManager get_vars() 15794 1726882624.36607: Calling all_inventory to load vars for managed_node1 15794 1726882624.36611: Calling groups_inventory to load vars for managed_node1 15794 1726882624.36614: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882624.36626: Calling all_plugins_play to load vars for managed_node1 15794 1726882624.36630: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882624.36763: Calling groups_plugins_play to load vars for managed_node1 15794 1726882624.39832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882624.45058: done with get_vars() 15794 1726882624.45156: done getting variables 15794 1726882624.45240: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:04 -0400 (0:00:00.109) 0:00:22.010 ****** 15794 1726882624.45282: entering _queue_task() for managed_node1/copy 15794 1726882624.46014: worker is 1 (out of 1 available) 15794 1726882624.46030: exiting _queue_task() for managed_node1/copy 15794 1726882624.46103: done queuing things up, now waiting for results queue to drain 15794 1726882624.46106: waiting for pending results... 
15794 1726882624.46640: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15794 1726882624.46652: in run() - task 0affe814-3a2d-94e5-e48f-000000000028 15794 1726882624.46656: variable 'ansible_search_path' from source: unknown 15794 1726882624.46660: variable 'ansible_search_path' from source: unknown 15794 1726882624.46662: calling self._execute() 15794 1726882624.46667: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882624.46670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882624.46685: variable 'omit' from source: magic vars 15794 1726882624.47341: variable 'ansible_distribution_major_version' from source: facts 15794 1726882624.47346: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882624.47350: variable 'network_provider' from source: set_fact 15794 1726882624.47355: Evaluated conditional (network_provider == "initscripts"): False 15794 1726882624.47358: when evaluation is False, skipping this task 15794 1726882624.47363: _execute() done 15794 1726882624.47366: dumping result to json 15794 1726882624.47372: done dumping result, returning 15794 1726882624.47382: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-94e5-e48f-000000000028] 15794 1726882624.47390: sending task result for task 0affe814-3a2d-94e5-e48f-000000000028 15794 1726882624.47507: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000028 15794 1726882624.47510: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15794 1726882624.47582: no more pending results, returning what we have 15794 1726882624.47587: results queue empty 15794 1726882624.47589: checking for 
any_errors_fatal 15794 1726882624.47596: done checking for any_errors_fatal 15794 1726882624.47597: checking for max_fail_percentage 15794 1726882624.47599: done checking for max_fail_percentage 15794 1726882624.47601: checking to see if all hosts have failed and the running result is not ok 15794 1726882624.47602: done checking to see if all hosts have failed 15794 1726882624.47603: getting the remaining hosts for this loop 15794 1726882624.47605: done getting the remaining hosts for this loop 15794 1726882624.47610: getting the next task for host managed_node1 15794 1726882624.47622: done getting next task for host managed_node1 15794 1726882624.47628: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15794 1726882624.47631: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882624.47648: getting variables 15794 1726882624.47650: in VariableManager get_vars() 15794 1726882624.47696: Calling all_inventory to load vars for managed_node1 15794 1726882624.47699: Calling groups_inventory to load vars for managed_node1 15794 1726882624.47702: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882624.47716: Calling all_plugins_play to load vars for managed_node1 15794 1726882624.47719: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882624.47723: Calling groups_plugins_play to load vars for managed_node1 15794 1726882624.50230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882624.53347: done with get_vars() 15794 1726882624.53394: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:04 -0400 (0:00:00.082) 0:00:22.092 ****** 15794 1726882624.53494: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15794 1726882624.53501: Creating lock for fedora.linux_system_roles.network_connections 15794 1726882624.53905: worker is 1 (out of 1 available) 15794 1726882624.53917: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15794 1726882624.53931: done queuing things up, now waiting for results queue to drain 15794 1726882624.53935: waiting for pending results... 
15794 1726882624.54455: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15794 1726882624.54462: in run() - task 0affe814-3a2d-94e5-e48f-000000000029 15794 1726882624.54465: variable 'ansible_search_path' from source: unknown 15794 1726882624.54468: variable 'ansible_search_path' from source: unknown 15794 1726882624.54471: calling self._execute() 15794 1726882624.54944: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882624.54948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882624.54952: variable 'omit' from source: magic vars 15794 1726882624.55535: variable 'ansible_distribution_major_version' from source: facts 15794 1726882624.55739: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882624.55743: variable 'omit' from source: magic vars 15794 1726882624.55746: variable 'omit' from source: magic vars 15794 1726882624.56171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882624.61683: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882624.61975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882624.62022: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882624.62171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882624.62206: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882624.62472: variable 'network_provider' from source: set_fact 15794 1726882624.62829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882624.62881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882624.62917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882624.63176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882624.63197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882624.63336: variable 'omit' from source: magic vars 15794 1726882624.63692: variable 'omit' from source: magic vars 15794 1726882624.63882: variable 'network_connections' from source: play vars 15794 1726882624.63901: variable 'interface' from source: set_fact 15794 1726882624.64100: variable 'interface' from source: set_fact 15794 1726882624.64108: variable 'interface' from source: set_fact 15794 1726882624.64440: variable 'interface' from source: set_fact 15794 1726882624.64939: variable 'omit' from source: magic vars 15794 1726882624.64943: variable '__lsr_ansible_managed' from source: task vars 15794 1726882624.64946: variable '__lsr_ansible_managed' from source: task vars 15794 1726882624.65228: Loaded config def from plugin (lookup/template) 15794 1726882624.65236: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15794 1726882624.65268: File lookup term: get_ansible_managed.j2 15794 
1726882624.65272: variable 'ansible_search_path' from source: unknown 15794 1726882624.65278: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15794 1726882624.65298: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15794 1726882624.65321: variable 'ansible_search_path' from source: unknown 15794 1726882624.85412: variable 'ansible_managed' from source: unknown 15794 1726882624.85929: variable 'omit' from source: magic vars 15794 1726882624.85966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882624.86001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882624.86170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882624.86174: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882624.86177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882624.86210: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882624.86214: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882624.86219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882624.86569: Set connection var ansible_connection to ssh 15794 1726882624.86579: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882624.86592: Set connection var ansible_pipelining to False 15794 1726882624.86603: Set connection var ansible_shell_executable to /bin/sh 15794 1726882624.86609: Set connection var ansible_shell_type to sh 15794 1726882624.86615: Set connection var ansible_timeout to 10 15794 1726882624.86650: variable 'ansible_shell_executable' from source: unknown 15794 1726882624.86653: variable 'ansible_connection' from source: unknown 15794 1726882624.86777: variable 'ansible_module_compression' from source: unknown 15794 1726882624.86785: variable 'ansible_shell_type' from source: unknown 15794 1726882624.86789: variable 'ansible_shell_executable' from source: unknown 15794 1726882624.86938: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882624.86943: variable 'ansible_pipelining' from source: unknown 15794 1726882624.86946: variable 'ansible_timeout' from source: unknown 15794 1726882624.86949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882624.87242: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882624.87255: variable 'omit' from source: magic vars 15794 1726882624.87328: starting attempt loop 15794 1726882624.87332: running the handler 15794 1726882624.87378: _low_level_execute_command(): starting 15794 1726882624.87382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882624.88695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882624.88752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882624.88765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882624.88781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882624.88821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882624.88825: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882624.88833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882624.89011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882624.89069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 
1726882624.89368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882624.91268: stdout chunk (state=3): >>>/root <<< 15794 1726882624.91466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882624.91473: stdout chunk (state=3): >>><<< 15794 1726882624.91483: stderr chunk (state=3): >>><<< 15794 1726882624.91505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882624.91519: _low_level_execute_command(): starting 15794 1726882624.91526: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654 `" && echo ansible-tmp-1726882624.9150574-16606-86884885577654="` echo 
/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654 `" ) && sleep 0' 15794 1726882624.92841: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882624.92844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882624.93081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882624.93408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882624.93449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882624.95551: stdout chunk (state=3): >>>ansible-tmp-1726882624.9150574-16606-86884885577654=/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654 <<< 15794 1726882624.95740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882624.95769: stdout chunk (state=3): >>><<< 15794 1726882624.95772: stderr chunk (state=3): >>><<< 15794 1726882624.95791: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882624.9150574-16606-86884885577654=/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882624.95941: variable 'ansible_module_compression' from source: unknown 15794 1726882624.95995: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15794 1726882624.96047: ANSIBALLZ: Acquiring lock 15794 1726882624.96056: ANSIBALLZ: Lock acquired: 139758812442080 15794 1726882624.96066: ANSIBALLZ: Creating module 15794 1726882625.58660: ANSIBALLZ: Writing module into payload 15794 1726882625.59656: ANSIBALLZ: Writing module 15794 1726882625.59688: ANSIBALLZ: Renaming module 15794 1726882625.59694: ANSIBALLZ: Done creating module 15794 1726882625.59723: variable 'ansible_facts' from source: unknown 15794 1726882625.60052: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/AnsiballZ_network_connections.py 15794 1726882625.60489: Sending initial data 15794 1726882625.60492: Sent initial data (167 bytes) 15794 1726882625.61751: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882625.62029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882625.62056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882625.62163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882625.63902: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882625.63963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882625.64062: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp914fnimx /root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/AnsiballZ_network_connections.py <<< 15794 1726882625.64066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/AnsiballZ_network_connections.py" <<< 15794 1726882625.64091: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp914fnimx" to remote "/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/AnsiballZ_network_connections.py" <<< 15794 1726882625.69247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882625.69252: stdout chunk (state=3): >>><<< 15794 1726882625.69255: stderr chunk (state=3): >>><<< 15794 1726882625.69258: done transferring module to remote 15794 1726882625.69261: _low_level_execute_command(): starting 15794 1726882625.69264: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/ /root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/AnsiballZ_network_connections.py && sleep 0' 15794 1726882625.70741: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
<<< 15794 1726882625.70876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882625.70994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882625.71022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882625.71106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882625.73113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882625.73117: stdout chunk (state=3): >>><<< 15794 1726882625.73119: stderr chunk (state=3): >>><<< 15794 1726882625.73155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882625.73158: _low_level_execute_command(): starting 15794 1726882625.73165: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/AnsiballZ_network_connections.py && sleep 0' 15794 1726882625.74368: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882625.74501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882625.74691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882625.74750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882626.08729: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15794 1726882626.11053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882626.11057: stdout chunk (state=3): >>><<< 15794 1726882626.11059: stderr chunk (state=3): >>><<< 15794 1726882626.11227: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882626.11231: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882626.11332: _low_level_execute_command(): starting 15794 1726882626.11341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882624.9150574-16606-86884885577654/ > /dev/null 2>&1 && sleep 0' 15794 1726882626.13137: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882626.13209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882626.13223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882626.13349: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882626.13422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882626.13768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882626.15895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882626.15900: stdout chunk (state=3): >>><<< 15794 1726882626.15903: stderr chunk (state=3): >>><<< 15794 1726882626.15907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882626.15910: handler run complete 15794 1726882626.15913: attempt loop complete, returning result 15794 1726882626.15915: _execute() done 15794 1726882626.16006: dumping result to json 15794 1726882626.16021: done dumping result, returning 15794 1726882626.16040: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-94e5-e48f-000000000029] 15794 1726882626.16122: sending task result for task 0affe814-3a2d-94e5-e48f-000000000029 15794 1726882626.16527: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000029 15794 1726882626.16531: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b (not-active) 15794 1726882626.16683: no more pending results, returning what we have 15794 1726882626.16687: results queue empty 15794 1726882626.16688: checking for any_errors_fatal 
15794 1726882626.16696: done checking for any_errors_fatal 15794 1726882626.16697: checking for max_fail_percentage 15794 1726882626.16699: done checking for max_fail_percentage 15794 1726882626.16700: checking to see if all hosts have failed and the running result is not ok 15794 1726882626.16701: done checking to see if all hosts have failed 15794 1726882626.16702: getting the remaining hosts for this loop 15794 1726882626.16704: done getting the remaining hosts for this loop 15794 1726882626.16708: getting the next task for host managed_node1 15794 1726882626.16715: done getting next task for host managed_node1 15794 1726882626.16719: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15794 1726882626.16722: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882626.16732: getting variables 15794 1726882626.17120: in VariableManager get_vars() 15794 1726882626.17170: Calling all_inventory to load vars for managed_node1 15794 1726882626.17240: Calling groups_inventory to load vars for managed_node1 15794 1726882626.17244: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882626.17256: Calling all_plugins_play to load vars for managed_node1 15794 1726882626.17260: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882626.17266: Calling groups_plugins_play to load vars for managed_node1 15794 1726882626.21786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882626.30260: done with get_vars() 15794 1726882626.30545: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:06 -0400 (0:00:01.772) 0:00:23.865 ****** 15794 1726882626.30775: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15794 1726882626.30777: Creating lock for fedora.linux_system_roles.network_state 15794 1726882626.31539: worker is 1 (out of 1 available) 15794 1726882626.31554: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15794 1726882626.31568: done queuing things up, now waiting for results queue to drain 15794 1726882626.31570: waiting for pending results... 
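The completed task above ("Configure networking connection profiles") logged the exact `module_args` handed to `fedora.linux_system_roles.network_connections`. As a rough reconstruction, a play like the following would drive an equivalent invocation — the host name, interface name `lsr27`, and IP address are taken from the logged result; everything else is a hedged sketch, not the test suite's actual playbook:

```yaml
# Hypothetical reconstruction from the logged module_args;
# not the playbook actually used in this run.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: lsr27
            interface_name: lsr27
            type: ethernet
            state: up
            autoconnect: true
            ip:
              address: 192.0.2.1/24
```

With `provider: nm` (the default chosen in this run, per the logged `module_args`), the role translates each entry of `network_connections` into a NetworkManager connection profile, which is why the result's `stderr` reports "add connection lsr27" followed by "up connection lsr27".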
15794 1726882626.32476: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15794 1726882626.33209: in run() - task 0affe814-3a2d-94e5-e48f-00000000002a 15794 1726882626.33213: variable 'ansible_search_path' from source: unknown 15794 1726882626.33216: variable 'ansible_search_path' from source: unknown 15794 1726882626.33219: calling self._execute() 15794 1726882626.33641: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.33666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882626.33776: variable 'omit' from source: magic vars 15794 1726882626.35042: variable 'ansible_distribution_major_version' from source: facts 15794 1726882626.35061: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882626.35409: variable 'network_state' from source: role '' defaults 15794 1726882626.35476: Evaluated conditional (network_state != {}): False 15794 1726882626.35489: when evaluation is False, skipping this task 15794 1726882626.35498: _execute() done 15794 1726882626.35512: dumping result to json 15794 1726882626.35522: done dumping result, returning 15794 1726882626.35615: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-94e5-e48f-00000000002a] 15794 1726882626.35619: sending task result for task 0affe814-3a2d-94e5-e48f-00000000002a 15794 1726882626.35917: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000002a 15794 1726882626.35921: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882626.35981: no more pending results, returning what we have 15794 1726882626.35986: results queue empty 15794 1726882626.35987: checking for any_errors_fatal 15794 1726882626.35999: done checking for any_errors_fatal 
15794 1726882626.36000: checking for max_fail_percentage 15794 1726882626.36003: done checking for max_fail_percentage 15794 1726882626.36004: checking to see if all hosts have failed and the running result is not ok 15794 1726882626.36005: done checking to see if all hosts have failed 15794 1726882626.36006: getting the remaining hosts for this loop 15794 1726882626.36009: done getting the remaining hosts for this loop 15794 1726882626.36013: getting the next task for host managed_node1 15794 1726882626.36023: done getting next task for host managed_node1 15794 1726882626.36028: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15794 1726882626.36031: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882626.36051: getting variables 15794 1726882626.36053: in VariableManager get_vars() 15794 1726882626.36104: Calling all_inventory to load vars for managed_node1 15794 1726882626.36108: Calling groups_inventory to load vars for managed_node1 15794 1726882626.36111: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882626.36124: Calling all_plugins_play to load vars for managed_node1 15794 1726882626.36127: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882626.36131: Calling groups_plugins_play to load vars for managed_node1 15794 1726882626.42683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882626.48682: done with get_vars() 15794 1726882626.48746: done getting variables 15794 1726882626.48830: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:06 -0400 (0:00:00.180) 0:00:24.046 ****** 15794 1726882626.48870: entering _queue_task() for managed_node1/debug 15794 1726882626.49383: worker is 1 (out of 1 available) 15794 1726882626.49397: exiting _queue_task() for managed_node1/debug 15794 1726882626.49409: done queuing things up, now waiting for results queue to drain 15794 1726882626.49411: waiting for pending results... 
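The task queued above ("Show stderr messages for the network_connections", resolved at `roles/network/tasks/main.yml:177`) is a plain `debug` over the result registered by the previous module run. Schematically — a sketch of the task shape inferred from the logged output, not the verbatim role source:

```yaml
# Sketch: the logged output ("__network_connections_result.stderr_lines": [...])
# matches the var-form of the debug action.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```

Surfacing `stderr_lines` this way is useful because the `network_connections` module reports its per-connection state transitions (add/up, active or not) on stderr rather than in the normal module return keys.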
15794 1726882626.50254: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15794 1726882626.50637: in run() - task 0affe814-3a2d-94e5-e48f-00000000002b 15794 1726882626.50642: variable 'ansible_search_path' from source: unknown 15794 1726882626.50645: variable 'ansible_search_path' from source: unknown 15794 1726882626.50647: calling self._execute() 15794 1726882626.51075: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.51080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882626.51084: variable 'omit' from source: magic vars 15794 1726882626.51945: variable 'ansible_distribution_major_version' from source: facts 15794 1726882626.52164: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882626.52170: variable 'omit' from source: magic vars 15794 1726882626.52173: variable 'omit' from source: magic vars 15794 1726882626.52293: variable 'omit' from source: magic vars 15794 1726882626.52394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882626.52540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882626.52543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882626.52611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882626.52839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882626.52845: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882626.52848: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.52851: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15794 1726882626.53001: Set connection var ansible_connection to ssh 15794 1726882626.53065: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882626.53088: Set connection var ansible_pipelining to False 15794 1726882626.53186: Set connection var ansible_shell_executable to /bin/sh 15794 1726882626.53190: Set connection var ansible_shell_type to sh 15794 1726882626.53272: Set connection var ansible_timeout to 10 15794 1726882626.53276: variable 'ansible_shell_executable' from source: unknown 15794 1726882626.53278: variable 'ansible_connection' from source: unknown 15794 1726882626.53281: variable 'ansible_module_compression' from source: unknown 15794 1726882626.53283: variable 'ansible_shell_type' from source: unknown 15794 1726882626.53285: variable 'ansible_shell_executable' from source: unknown 15794 1726882626.53342: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.53344: variable 'ansible_pipelining' from source: unknown 15794 1726882626.53353: variable 'ansible_timeout' from source: unknown 15794 1726882626.53364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882626.53838: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882626.53844: variable 'omit' from source: magic vars 15794 1726882626.53857: starting attempt loop 15794 1726882626.53926: running the handler 15794 1726882626.54319: variable '__network_connections_result' from source: set_fact 15794 1726882626.54453: handler run complete 15794 1726882626.54493: attempt loop complete, returning result 15794 1726882626.54592: _execute() done 15794 1726882626.54596: dumping result to json 15794 1726882626.54598: 
done dumping result, returning 15794 1726882626.54601: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-94e5-e48f-00000000002b] 15794 1726882626.54604: sending task result for task 0affe814-3a2d-94e5-e48f-00000000002b 15794 1726882626.54941: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000002b 15794 1726882626.54944: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b (not-active)" ] } 15794 1726882626.55044: no more pending results, returning what we have 15794 1726882626.55048: results queue empty 15794 1726882626.55050: checking for any_errors_fatal 15794 1726882626.55057: done checking for any_errors_fatal 15794 1726882626.55058: checking for max_fail_percentage 15794 1726882626.55061: done checking for max_fail_percentage 15794 1726882626.55062: checking to see if all hosts have failed and the running result is not ok 15794 1726882626.55063: done checking to see if all hosts have failed 15794 1726882626.55064: getting the remaining hosts for this loop 15794 1726882626.55066: done getting the remaining hosts for this loop 15794 1726882626.55071: getting the next task for host managed_node1 15794 1726882626.55198: done getting next task for host managed_node1 15794 1726882626.55203: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15794 1726882626.55206: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15794 1726882626.55219: getting variables 15794 1726882626.55221: in VariableManager get_vars() 15794 1726882626.55268: Calling all_inventory to load vars for managed_node1 15794 1726882626.55271: Calling groups_inventory to load vars for managed_node1 15794 1726882626.55274: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882626.55286: Calling all_plugins_play to load vars for managed_node1 15794 1726882626.55291: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882626.55635: Calling groups_plugins_play to load vars for managed_node1 15794 1726882626.61943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882626.71655: done with get_vars() 15794 1726882626.71707: done getting variables 15794 1726882626.72007: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:06 -0400 (0:00:00.231) 0:00:24.279 ****** 15794 1726882626.72198: entering _queue_task() for managed_node1/debug 15794 1726882626.73395: worker is 1 (out of 1 available) 15794 1726882626.73409: exiting _queue_task() for managed_node1/debug 15794 1726882626.73424: done queuing things up, now waiting for results queue to drain 15794 1726882626.73426: waiting for pending results... 
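In the chunk above, "Configure networking state" was skipped because the role-default `network_state` is an empty dict, so the task's `when:` gate evaluated to False. The gating pattern looks roughly like this — a sketch of the conditional shape seen in the log, not the role's verbatim task:

```yaml
# Hypothetical sketch of the skip gate; field names follow the
# variables visible in the log, not the actual role source.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when: network_state != {}   # False in this run: network_state defaults to {}
```

This is why the result shows `"false_condition": "network_state != {}"` with `skip_reason: Conditional result was False`: the role supports both a connection-profile path (`network_connections`, exercised here) and a declarative state path (`network_state`, unused here).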
15794 1726882626.74168: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15794 1726882626.74450: in run() - task 0affe814-3a2d-94e5-e48f-00000000002c 15794 1726882626.74504: variable 'ansible_search_path' from source: unknown 15794 1726882626.74508: variable 'ansible_search_path' from source: unknown 15794 1726882626.74589: calling self._execute() 15794 1726882626.74830: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.75136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882626.75142: variable 'omit' from source: magic vars 15794 1726882626.76444: variable 'ansible_distribution_major_version' from source: facts 15794 1726882626.76449: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882626.76452: variable 'omit' from source: magic vars 15794 1726882626.76455: variable 'omit' from source: magic vars 15794 1726882626.76457: variable 'omit' from source: magic vars 15794 1726882626.76881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882626.76924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882626.77253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882626.77279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882626.77297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882626.77338: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882626.77752: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.77758: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15794 1726882626.77940: Set connection var ansible_connection to ssh 15794 1726882626.77944: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882626.77947: Set connection var ansible_pipelining to False 15794 1726882626.77950: Set connection var ansible_shell_executable to /bin/sh 15794 1726882626.77952: Set connection var ansible_shell_type to sh 15794 1726882626.77955: Set connection var ansible_timeout to 10 15794 1726882626.78207: variable 'ansible_shell_executable' from source: unknown 15794 1726882626.78211: variable 'ansible_connection' from source: unknown 15794 1726882626.78215: variable 'ansible_module_compression' from source: unknown 15794 1726882626.78218: variable 'ansible_shell_type' from source: unknown 15794 1726882626.78221: variable 'ansible_shell_executable' from source: unknown 15794 1726882626.78227: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.78232: variable 'ansible_pipelining' from source: unknown 15794 1726882626.78236: variable 'ansible_timeout' from source: unknown 15794 1726882626.78243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882626.78951: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882626.78976: variable 'omit' from source: magic vars 15794 1726882626.78980: starting attempt loop 15794 1726882626.78983: running the handler 15794 1726882626.79142: variable '__network_connections_result' from source: set_fact 15794 1726882626.79519: variable '__network_connections_result' from source: set_fact 15794 1726882626.79842: handler run complete 15794 1726882626.80175: attempt loop complete, returning result 15794 1726882626.80182: 
_execute() done 15794 1726882626.80185: dumping result to json 15794 1726882626.80188: done dumping result, returning 15794 1726882626.80201: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-94e5-e48f-00000000002c] 15794 1726882626.80207: sending task result for task 0affe814-3a2d-94e5-e48f-00000000002c ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 666af3c8-45ed-476e-bdc6-601fe256e49b (not-active)" ] } } 15794 1726882626.80789: no more pending results, returning what we have 15794 1726882626.80916: results queue empty 15794 1726882626.80918: checking for any_errors_fatal 15794 1726882626.80928: done checking for any_errors_fatal 15794 1726882626.80930: checking for max_fail_percentage 15794 1726882626.80931: done checking for max_fail_percentage 15794 1726882626.80933: checking to see if all hosts have failed and the running result is not ok 15794 1726882626.80936: done checking to see if all hosts have failed 15794 1726882626.80937: getting the remaining hosts for this loop 15794 1726882626.80939: done getting the 
remaining hosts for this loop 15794 1726882626.80944: getting the next task for host managed_node1 15794 1726882626.80950: done getting next task for host managed_node1 15794 1726882626.80955: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15794 1726882626.80958: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882626.80970: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000002c 15794 1726882626.80973: WORKER PROCESS EXITING 15794 1726882626.80982: getting variables 15794 1726882626.80984: in VariableManager get_vars() 15794 1726882626.81142: Calling all_inventory to load vars for managed_node1 15794 1726882626.81146: Calling groups_inventory to load vars for managed_node1 15794 1726882626.81150: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882626.81160: Calling all_plugins_play to load vars for managed_node1 15794 1726882626.81164: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882626.81168: Calling groups_plugins_play to load vars for managed_node1 15794 1726882626.88105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882626.96426: done with get_vars() 15794 1726882626.96706: done getting variables 15794 1726882626.96782: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:06 -0400 (0:00:00.247) 0:00:24.527 ****** 15794 1726882626.96941: entering _queue_task() for managed_node1/debug 15794 1726882626.97665: worker is 1 (out of 1 available) 15794 1726882626.97679: exiting _queue_task() for managed_node1/debug 15794 1726882626.97693: done queuing things up, now waiting for results queue to drain 15794 1726882626.97695: waiting for pending results... 15794 1726882626.98548: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15794 1726882626.98994: in run() - task 0affe814-3a2d-94e5-e48f-00000000002d 15794 1726882626.99013: variable 'ansible_search_path' from source: unknown 15794 1726882626.99018: variable 'ansible_search_path' from source: unknown 15794 1726882626.99142: calling self._execute() 15794 1726882626.99399: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882626.99682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882626.99693: variable 'omit' from source: magic vars 15794 1726882627.01021: variable 'ansible_distribution_major_version' from source: facts 15794 1726882627.01037: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882627.01581: variable 'network_state' from source: role '' defaults 15794 1726882627.02056: Evaluated conditional (network_state != {}): False 15794 1726882627.02060: when evaluation is False, skipping this task 15794 1726882627.02063: _execute() done 15794 1726882627.02066: dumping result to json 15794 1726882627.02191: done dumping result, returning 15794 1726882627.02196: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-94e5-e48f-00000000002d] 15794 1726882627.02199: sending task result for task 
0affe814-3a2d-94e5-e48f-00000000002d 15794 1726882627.02281: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000002d 15794 1726882627.02285: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15794 1726882627.02348: no more pending results, returning what we have 15794 1726882627.02354: results queue empty 15794 1726882627.02355: checking for any_errors_fatal 15794 1726882627.02368: done checking for any_errors_fatal 15794 1726882627.02369: checking for max_fail_percentage 15794 1726882627.02372: done checking for max_fail_percentage 15794 1726882627.02373: checking to see if all hosts have failed and the running result is not ok 15794 1726882627.02374: done checking to see if all hosts have failed 15794 1726882627.02375: getting the remaining hosts for this loop 15794 1726882627.02378: done getting the remaining hosts for this loop 15794 1726882627.02384: getting the next task for host managed_node1 15794 1726882627.02393: done getting next task for host managed_node1 15794 1726882627.02398: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15794 1726882627.02402: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882627.02420: getting variables 15794 1726882627.02422: in VariableManager get_vars() 15794 1726882627.02471: Calling all_inventory to load vars for managed_node1 15794 1726882627.02475: Calling groups_inventory to load vars for managed_node1 15794 1726882627.02478: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882627.02492: Calling all_plugins_play to load vars for managed_node1 15794 1726882627.02497: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882627.02501: Calling groups_plugins_play to load vars for managed_node1 15794 1726882627.09717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882627.15807: done with get_vars() 15794 1726882627.16077: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:07 -0400 (0:00:00.192) 0:00:24.719 ****** 15794 1726882627.16198: entering _queue_task() for managed_node1/ping 15794 1726882627.16200: Creating lock for ping 15794 1726882627.17172: worker is 1 (out of 1 available) 15794 1726882627.17186: exiting _queue_task() for managed_node1/ping 15794 1726882627.17199: done queuing things up, now waiting for results queue to drain 15794 1726882627.17201: waiting for pending results... 
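The skip recorded just above (`skipping: [managed_node1] => {"false_condition": "network_state != {}"}`) is the normal outcome of Ansible templating the task's `when:` expression against the role defaults, where `network_state` is an empty dict. As a rough stand-in (Ansible really uses Jinja2 templating; `evaluate_when` and `run_or_skip` are invented names for illustration only):

```python
def evaluate_when(conditional: str, variables: dict) -> bool:
    # Stand-in for Jinja2 templating of a `when:` clause; eval() is used
    # here purely for illustration and would be unsafe in real code.
    return bool(eval(conditional, {}, dict(variables)))

def run_or_skip(conditional: str, variables: dict) -> dict:
    if not evaluate_when(conditional, variables):
        # Mirrors the log's skip payload:
        # skipping: [managed_node1] => {"false_condition": "network_state != {}"}
        return {"skipped": True, "false_condition": conditional}
    return {"changed": False}

# The role default leaves network_state as {}, so the task is skipped:
print(run_or_skip("network_state != {}", {"network_state": {}}))
# {'skipped': True, 'false_condition': 'network_state != {}'}
```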
15794 1726882627.18325: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15794 1726882627.18392: in run() - task 0affe814-3a2d-94e5-e48f-00000000002e 15794 1726882627.18746: variable 'ansible_search_path' from source: unknown 15794 1726882627.18751: variable 'ansible_search_path' from source: unknown 15794 1726882627.18755: calling self._execute() 15794 1726882627.19047: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882627.19291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882627.19295: variable 'omit' from source: magic vars 15794 1726882627.20528: variable 'ansible_distribution_major_version' from source: facts 15794 1726882627.20929: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882627.20932: variable 'omit' from source: magic vars 15794 1726882627.20937: variable 'omit' from source: magic vars 15794 1726882627.20955: variable 'omit' from source: magic vars 15794 1726882627.21087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882627.21184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882627.21387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882627.21464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882627.21485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882627.21641: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882627.21651: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882627.21659: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15794 1726882627.22235: Set connection var ansible_connection to ssh 15794 1726882627.22240: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882627.22243: Set connection var ansible_pipelining to False 15794 1726882627.22245: Set connection var ansible_shell_executable to /bin/sh 15794 1726882627.22247: Set connection var ansible_shell_type to sh 15794 1726882627.22249: Set connection var ansible_timeout to 10 15794 1726882627.22252: variable 'ansible_shell_executable' from source: unknown 15794 1726882627.22254: variable 'ansible_connection' from source: unknown 15794 1726882627.22256: variable 'ansible_module_compression' from source: unknown 15794 1726882627.22259: variable 'ansible_shell_type' from source: unknown 15794 1726882627.22262: variable 'ansible_shell_executable' from source: unknown 15794 1726882627.22264: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882627.22268: variable 'ansible_pipelining' from source: unknown 15794 1726882627.22271: variable 'ansible_timeout' from source: unknown 15794 1726882627.22643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882627.23051: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882627.23072: variable 'omit' from source: magic vars 15794 1726882627.23085: starting attempt loop 15794 1726882627.23093: running the handler 15794 1726882627.23121: _low_level_execute_command(): starting 15794 1726882627.23164: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882627.24681: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 
1726882627.24753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882627.24996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882627.25055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882627.26831: stdout chunk (state=3): >>>/root <<< 15794 1726882627.27044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882627.27047: stdout chunk (state=3): >>><<< 15794 1726882627.27050: stderr chunk (state=3): >>><<< 15794 1726882627.27253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882627.27257: _low_level_execute_command(): starting 15794 1726882627.27259: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785 `" && echo ansible-tmp-1726882627.27141-16679-40519001111785="` echo /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785 `" ) && sleep 0' 15794 1726882627.28553: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15794 1726882627.28740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882627.28768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882627.28856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882627.30860: stdout chunk (state=3): >>>ansible-tmp-1726882627.27141-16679-40519001111785=/root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785 <<< 15794 1726882627.31062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882627.31077: stderr chunk (state=3): >>><<< 15794 1726882627.31160: stdout chunk (state=3): >>><<< 15794 1726882627.31178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882627.27141-16679-40519001111785=/root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882627.31307: variable 'ansible_module_compression' from source: unknown 15794 1726882627.31351: ANSIBALLZ: Using lock for ping 15794 1726882627.31365: ANSIBALLZ: Acquiring lock 15794 1726882627.31374: ANSIBALLZ: Lock acquired: 139758812549568 15794 1726882627.31383: ANSIBALLZ: Creating module 15794 1726882627.80388: ANSIBALLZ: Writing module into payload 15794 1726882627.80633: ANSIBALLZ: Writing module 15794 1726882627.80639: ANSIBALLZ: Renaming module 15794 1726882627.80641: ANSIBALLZ: Done creating module 15794 1726882627.80644: variable 'ansible_facts' from source: unknown 15794 1726882627.80733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/AnsiballZ_ping.py 15794 1726882627.81161: Sending initial data 15794 1726882627.81164: Sent initial data (150 bytes) 15794 1726882627.82112: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882627.82222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 
1726882627.82320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882627.82406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882627.82436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882627.84156: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882627.84223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882627.84280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpntygn9xr /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/AnsiballZ_ping.py <<< 15794 1726882627.84284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/AnsiballZ_ping.py" <<< 15794 1726882627.84983: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpntygn9xr" to remote "/root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/AnsiballZ_ping.py" <<< 15794 1726882627.87455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882627.87915: stderr chunk (state=3): >>><<< 15794 1726882627.87919: stdout chunk (state=3): >>><<< 15794 1726882627.87922: done transferring module to remote 15794 1726882627.87925: _low_level_execute_command(): starting 15794 1726882627.87928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/ /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/AnsiballZ_ping.py && sleep 0' 15794 1726882627.89758: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882627.89943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882627.90042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882627.92066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882627.92070: stdout chunk (state=3): >>><<< 15794 1726882627.92078: stderr chunk (state=3): >>><<< 15794 1726882627.92161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882627.92169: _low_level_execute_command(): starting 15794 1726882627.92172: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/AnsiballZ_ping.py && sleep 0' 15794 1726882627.93370: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882627.93429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882627.93504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882627.93507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882627.93691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882628.10866: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15794 1726882628.12441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882628.12446: stdout chunk (state=3): >>><<< 15794 1726882628.12448: stderr chunk (state=3): >>><<< 15794 1726882628.12451: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
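The `_low_level_execute_command()` sequence traced above follows a fixed per-task lifecycle: create a private remote temp directory under a `umask 77` subshell, SFTP the `AnsiballZ_ping.py` payload into it, `chmod u+x` it, run it with the remote Python, then `rm -f -r` the directory. A local sketch of that lifecycle (not Ansible's code; it uses a temp base instead of the real `/root/.ansible/tmp`, and the real directory name also carries a random suffix):

```python
import json
import os
import random
import shutil
import stat
import subprocess
import sys
import tempfile
import time

# Private temp dir, modeled on `( umask 77 && mkdir -p ... && mkdir ... )`:
base = os.path.join(tempfile.gettempdir(), ".ansible", "tmp")
stamp = f"ansible-tmp-{time.time():.5f}-{os.getpid()}-{random.randrange(10**14)}"
workdir = os.path.join(base, stamp)
os.makedirs(base, mode=0o700, exist_ok=True)
os.mkdir(workdir, mode=0o700)

# Stand-in for the SFTP `put` of the AnsiballZ payload seen in the log:
payload = os.path.join(workdir, "AnsiballZ_ping.py")
with open(payload, "w") as f:
    f.write('import json; print(json.dumps({"ping": "pong"}))\n')

# `chmod u+x` on the payload, then execute it with the interpreter:
os.chmod(payload, os.stat(payload).st_mode | stat.S_IXUSR)
out = subprocess.run([sys.executable, payload], capture_output=True, text=True)
print(out.stdout.strip())  # {"ping": "pong"}

# Cleanup, as in the final `rm -f -r ... > /dev/null 2>&1`:
shutil.rmtree(workdir)
```

The `umask 77` / mode `0o700` step is what keeps the staged module private to the connecting user before it runs.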
15794 1726882628.12453: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882628.12456: _low_level_execute_command(): starting 15794 1726882628.12465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882627.27141-16679-40519001111785/ > /dev/null 2>&1 && sleep 0' 15794 1726882628.13212: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882628.13238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882628.13324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882628.13379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882628.13413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882628.13472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882628.13545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882628.15590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882628.15593: stdout chunk (state=3): >>><<< 15794 1726882628.15596: stderr chunk (state=3): >>><<< 15794 1726882628.15861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882628.15865: handler run complete 15794 1726882628.15867: attempt loop complete, returning result 15794 
1726882628.15870: _execute() done 15794 1726882628.15872: dumping result to json 15794 1726882628.15875: done dumping result, returning 15794 1726882628.15877: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-94e5-e48f-00000000002e] 15794 1726882628.15882: sending task result for task 0affe814-3a2d-94e5-e48f-00000000002e 15794 1726882628.15960: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000002e 15794 1726882628.15963: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15794 1726882628.16044: no more pending results, returning what we have 15794 1726882628.16048: results queue empty 15794 1726882628.16049: checking for any_errors_fatal 15794 1726882628.16061: done checking for any_errors_fatal 15794 1726882628.16063: checking for max_fail_percentage 15794 1726882628.16065: done checking for max_fail_percentage 15794 1726882628.16066: checking to see if all hosts have failed and the running result is not ok 15794 1726882628.16067: done checking to see if all hosts have failed 15794 1726882628.16068: getting the remaining hosts for this loop 15794 1726882628.16071: done getting the remaining hosts for this loop 15794 1726882628.16077: getting the next task for host managed_node1 15794 1726882628.16090: done getting next task for host managed_node1 15794 1726882628.16093: ^ task is: TASK: meta (role_complete) 15794 1726882628.16096: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882628.16109: getting variables 15794 1726882628.16111: in VariableManager get_vars() 15794 1726882628.16417: Calling all_inventory to load vars for managed_node1 15794 1726882628.16421: Calling groups_inventory to load vars for managed_node1 15794 1726882628.16423: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.16506: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.16512: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.16517: Calling groups_plugins_play to load vars for managed_node1 15794 1726882628.19085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882628.29017: done with get_vars() 15794 1726882628.29069: done getting variables 15794 1726882628.29160: done queuing things up, now waiting for results queue to drain 15794 1726882628.29163: results queue empty 15794 1726882628.29164: checking for any_errors_fatal 15794 1726882628.29168: done checking for any_errors_fatal 15794 1726882628.29169: checking for max_fail_percentage 15794 1726882628.29170: done checking for max_fail_percentage 15794 1726882628.29171: checking to see if all hosts have failed and the running result is not ok 15794 1726882628.29172: done checking to see if all hosts have failed 15794 1726882628.29173: getting the remaining hosts for this loop 15794 1726882628.29174: done getting the remaining hosts for this loop 15794 1726882628.29178: getting the next task for host managed_node1 15794 1726882628.29182: done getting next task for host managed_node1 15794 1726882628.29185: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 15794 1726882628.29187: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15794 1726882628.29190: getting variables 15794 1726882628.29191: in VariableManager get_vars() 15794 1726882628.29206: Calling all_inventory to load vars for managed_node1 15794 1726882628.29209: Calling groups_inventory to load vars for managed_node1 15794 1726882628.29212: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.29218: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.29222: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.29225: Calling groups_plugins_play to load vars for managed_node1 15794 1726882628.31208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882628.34046: done with get_vars() 15794 1726882628.34080: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 21:37:08 -0400 (0:00:01.179) 0:00:25.899 ****** 15794 1726882628.34173: entering _queue_task() for managed_node1/include_tasks 15794 1726882628.34553: worker is 1 (out of 1 available) 15794 1726882628.34567: exiting _queue_task() for managed_node1/include_tasks 15794 1726882628.34746: done queuing things up, now waiting for results queue to drain 15794 1726882628.34749: waiting for pending results... 
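Stepping back to the `Re-test connectivity` result a few events earlier: the `{"ping": "pong", "invocation": {...}}` stdout is what `ansible.builtin.ping` emits whenever the host is reachable and its Python works. A minimal stand-in (not the real module source; the error message here is illustrative) looks like:

```python
import json

def ping_module(data: str = "pong") -> dict:
    """Minimal stand-in for ansible.builtin.ping: echo `data` back.

    The real module fails deliberately when data == "crash"; any other
    value is returned as-is, which is why a reachable host with a working
    interpreter always yields "pong".
    """
    if data == "crash":
        raise RuntimeError("boom")  # illustrative; the real failure differs
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module()))
# matches the stdout chunk in the log:
# {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
```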
15794 1726882628.35062: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 15794 1726882628.35069: in run() - task 0affe814-3a2d-94e5-e48f-000000000030 15794 1726882628.35073: variable 'ansible_search_path' from source: unknown 15794 1726882628.35102: calling self._execute() 15794 1726882628.35274: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882628.35281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882628.35285: variable 'omit' from source: magic vars 15794 1726882628.35720: variable 'ansible_distribution_major_version' from source: facts 15794 1726882628.35736: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882628.35753: _execute() done 15794 1726882628.35757: dumping result to json 15794 1726882628.35820: done dumping result, returning 15794 1726882628.35824: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0affe814-3a2d-94e5-e48f-000000000030] 15794 1726882628.35827: sending task result for task 0affe814-3a2d-94e5-e48f-000000000030 15794 1726882628.35907: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000030 15794 1726882628.35911: WORKER PROCESS EXITING 15794 1726882628.35947: no more pending results, returning what we have 15794 1726882628.35956: in VariableManager get_vars() 15794 1726882628.36005: Calling all_inventory to load vars for managed_node1 15794 1726882628.36008: Calling groups_inventory to load vars for managed_node1 15794 1726882628.36011: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.36027: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.36031: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.36037: Calling groups_plugins_play to load vars for managed_node1 15794 1726882628.38621: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882628.41884: done with get_vars() 15794 1726882628.41916: variable 'ansible_search_path' from source: unknown 15794 1726882628.41935: we have included files to process 15794 1726882628.41936: generating all_blocks data 15794 1726882628.41939: done generating all_blocks data 15794 1726882628.41948: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 15794 1726882628.41950: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 15794 1726882628.41954: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 15794 1726882628.42394: done processing included file 15794 1726882628.42397: iterating over new_blocks loaded from include file 15794 1726882628.42399: in VariableManager get_vars() 15794 1726882628.42425: done with get_vars() 15794 1726882628.42428: filtering new block on tags 15794 1726882628.42455: done filtering new block on tags 15794 1726882628.42458: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node1 15794 1726882628.42466: extending task lists for all hosts with included blocks 15794 1726882628.42513: done extending task lists 15794 1726882628.42515: done processing included files 15794 1726882628.42516: results queue empty 15794 1726882628.42516: checking for any_errors_fatal 15794 1726882628.42518: done checking for any_errors_fatal 15794 1726882628.42521: checking for max_fail_percentage 15794 1726882628.42522: done checking for 
max_fail_percentage 15794 1726882628.42523: checking to see if all hosts have failed and the running result is not ok 15794 1726882628.42524: done checking to see if all hosts have failed 15794 1726882628.42525: getting the remaining hosts for this loop 15794 1726882628.42527: done getting the remaining hosts for this loop 15794 1726882628.42530: getting the next task for host managed_node1 15794 1726882628.42538: done getting next task for host managed_node1 15794 1726882628.42541: ^ task is: TASK: Assert that warnings is empty 15794 1726882628.42543: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882628.42547: getting variables 15794 1726882628.42548: in VariableManager get_vars() 15794 1726882628.42562: Calling all_inventory to load vars for managed_node1 15794 1726882628.42564: Calling groups_inventory to load vars for managed_node1 15794 1726882628.42567: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.42574: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.42577: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.42581: Calling groups_plugins_play to load vars for managed_node1 15794 1726882628.45723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882628.51315: done with get_vars() 15794 1726882628.51352: done getting variables 15794 1726882628.51968: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Friday 20 September 2024 21:37:08 -0400 (0:00:00.178) 0:00:26.078 ****** 15794 1726882628.52008: entering _queue_task() for managed_node1/assert 15794 1726882628.53046: worker is 1 (out of 1 available) 15794 1726882628.53057: exiting _queue_task() for managed_node1/assert 15794 1726882628.53069: done queuing things up, now waiting for results queue to drain 15794 1726882628.53070: waiting for pending results... 
15794 1726882628.54254: running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty 15794 1726882628.54341: in run() - task 0affe814-3a2d-94e5-e48f-000000000304 15794 1726882628.54345: variable 'ansible_search_path' from source: unknown 15794 1726882628.54347: variable 'ansible_search_path' from source: unknown 15794 1726882628.54350: calling self._execute() 15794 1726882628.54352: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882628.54354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882628.54358: variable 'omit' from source: magic vars 15794 1726882628.54788: variable 'ansible_distribution_major_version' from source: facts 15794 1726882628.54811: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882628.54827: variable 'omit' from source: magic vars 15794 1726882628.54877: variable 'omit' from source: magic vars 15794 1726882628.54928: variable 'omit' from source: magic vars 15794 1726882628.54985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882628.55040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882628.55072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882628.55101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882628.55118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882628.55163: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882628.55173: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882628.55186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 
1726882628.55360: Set connection var ansible_connection to ssh 15794 1726882628.55363: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882628.55365: Set connection var ansible_pipelining to False 15794 1726882628.55367: Set connection var ansible_shell_executable to /bin/sh 15794 1726882628.55370: Set connection var ansible_shell_type to sh 15794 1726882628.55373: Set connection var ansible_timeout to 10 15794 1726882628.55412: variable 'ansible_shell_executable' from source: unknown 15794 1726882628.55421: variable 'ansible_connection' from source: unknown 15794 1726882628.55429: variable 'ansible_module_compression' from source: unknown 15794 1726882628.55439: variable 'ansible_shell_type' from source: unknown 15794 1726882628.55447: variable 'ansible_shell_executable' from source: unknown 15794 1726882628.55455: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882628.55577: variable 'ansible_pipelining' from source: unknown 15794 1726882628.55584: variable 'ansible_timeout' from source: unknown 15794 1726882628.55587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882628.55711: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882628.55732: variable 'omit' from source: magic vars 15794 1726882628.55747: starting attempt loop 15794 1726882628.55754: running the handler 15794 1726882628.56037: variable '__network_connections_result' from source: set_fact 15794 1726882628.56059: Evaluated conditional ('warnings' not in __network_connections_result): True 15794 1726882628.56072: handler run complete 15794 1726882628.56103: attempt loop complete, returning result 15794 1726882628.56343: _execute() done 15794 
1726882628.56347: dumping result to json 15794 1726882628.56350: done dumping result, returning 15794 1726882628.56354: done running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty [0affe814-3a2d-94e5-e48f-000000000304] 15794 1726882628.56356: sending task result for task 0affe814-3a2d-94e5-e48f-000000000304 15794 1726882628.56447: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000304 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15794 1726882628.56516: no more pending results, returning what we have 15794 1726882628.56522: results queue empty 15794 1726882628.56523: checking for any_errors_fatal 15794 1726882628.56525: done checking for any_errors_fatal 15794 1726882628.56526: checking for max_fail_percentage 15794 1726882628.56529: done checking for max_fail_percentage 15794 1726882628.56530: checking to see if all hosts have failed and the running result is not ok 15794 1726882628.56531: done checking to see if all hosts have failed 15794 1726882628.56532: getting the remaining hosts for this loop 15794 1726882628.56537: done getting the remaining hosts for this loop 15794 1726882628.56542: getting the next task for host managed_node1 15794 1726882628.56550: done getting next task for host managed_node1 15794 1726882628.56554: ^ task is: TASK: Assert that there is output in stderr 15794 1726882628.56558: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882628.56563: getting variables 15794 1726882628.56565: in VariableManager get_vars() 15794 1726882628.56611: Calling all_inventory to load vars for managed_node1 15794 1726882628.56618: Calling groups_inventory to load vars for managed_node1 15794 1726882628.56622: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.57139: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.57147: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.57153: Calling groups_plugins_play to load vars for managed_node1 15794 1726882628.58143: WORKER PROCESS EXITING 15794 1726882628.63154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882628.71262: done with get_vars() 15794 1726882628.71427: done getting variables 15794 1726882628.71567: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Friday 20 September 2024 21:37:08 -0400 (0:00:00.195) 0:00:26.273 ****** 15794 1726882628.71733: entering _queue_task() for managed_node1/assert 15794 1726882628.72666: worker is 1 (out of 1 available) 15794 1726882628.72677: exiting _queue_task() for managed_node1/assert 15794 1726882628.72689: done queuing things up, now waiting for results queue to drain 15794 1726882628.72691: waiting for pending results... 
15794 1726882628.73114: running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr 15794 1726882628.73452: in run() - task 0affe814-3a2d-94e5-e48f-000000000305 15794 1726882628.73573: variable 'ansible_search_path' from source: unknown 15794 1726882628.73577: variable 'ansible_search_path' from source: unknown 15794 1726882628.73629: calling self._execute() 15794 1726882628.73858: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882628.73867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882628.73882: variable 'omit' from source: magic vars 15794 1726882628.74881: variable 'ansible_distribution_major_version' from source: facts 15794 1726882628.74890: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882628.74898: variable 'omit' from source: magic vars 15794 1726882628.75080: variable 'omit' from source: magic vars 15794 1726882628.75123: variable 'omit' from source: magic vars 15794 1726882628.75293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882628.75339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882628.75464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882628.75741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882628.75746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882628.75749: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882628.75752: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882628.75755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 
1726882628.75872: Set connection var ansible_connection to ssh 15794 1726882628.75883: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882628.75892: Set connection var ansible_pipelining to False 15794 1726882628.75900: Set connection var ansible_shell_executable to /bin/sh 15794 1726882628.76028: Set connection var ansible_shell_type to sh 15794 1726882628.76043: Set connection var ansible_timeout to 10 15794 1726882628.76089: variable 'ansible_shell_executable' from source: unknown 15794 1726882628.76093: variable 'ansible_connection' from source: unknown 15794 1726882628.76096: variable 'ansible_module_compression' from source: unknown 15794 1726882628.76099: variable 'ansible_shell_type' from source: unknown 15794 1726882628.76102: variable 'ansible_shell_executable' from source: unknown 15794 1726882628.76104: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882628.76107: variable 'ansible_pipelining' from source: unknown 15794 1726882628.76109: variable 'ansible_timeout' from source: unknown 15794 1726882628.76111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882628.76635: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882628.76742: variable 'omit' from source: magic vars 15794 1726882628.76746: starting attempt loop 15794 1726882628.76748: running the handler 15794 1726882628.77004: variable '__network_connections_result' from source: set_fact 15794 1726882628.77026: Evaluated conditional ('stderr' in __network_connections_result): True 15794 1726882628.77033: handler run complete 15794 1726882628.77053: attempt loop complete, returning result 15794 1726882628.77056: _execute() done 15794 
1726882628.77059: dumping result to json 15794 1726882628.77068: done dumping result, returning 15794 1726882628.77071: done running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr [0affe814-3a2d-94e5-e48f-000000000305] 15794 1726882628.77082: sending task result for task 0affe814-3a2d-94e5-e48f-000000000305 15794 1726882628.77471: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000305 15794 1726882628.77475: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15794 1726882628.77531: no more pending results, returning what we have 15794 1726882628.77538: results queue empty 15794 1726882628.77539: checking for any_errors_fatal 15794 1726882628.77553: done checking for any_errors_fatal 15794 1726882628.77555: checking for max_fail_percentage 15794 1726882628.77557: done checking for max_fail_percentage 15794 1726882628.77558: checking to see if all hosts have failed and the running result is not ok 15794 1726882628.77559: done checking to see if all hosts have failed 15794 1726882628.77560: getting the remaining hosts for this loop 15794 1726882628.77562: done getting the remaining hosts for this loop 15794 1726882628.77567: getting the next task for host managed_node1 15794 1726882628.77578: done getting next task for host managed_node1 15794 1726882628.77581: ^ task is: TASK: meta (flush_handlers) 15794 1726882628.77584: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882628.77589: getting variables 15794 1726882628.77591: in VariableManager get_vars() 15794 1726882628.77860: Calling all_inventory to load vars for managed_node1 15794 1726882628.77864: Calling groups_inventory to load vars for managed_node1 15794 1726882628.77868: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.77879: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.77882: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.77887: Calling groups_plugins_play to load vars for managed_node1 15794 1726882628.83142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882628.89293: done with get_vars() 15794 1726882628.89351: done getting variables 15794 1726882628.89454: in VariableManager get_vars() 15794 1726882628.89471: Calling all_inventory to load vars for managed_node1 15794 1726882628.89474: Calling groups_inventory to load vars for managed_node1 15794 1726882628.89477: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.89486: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.89490: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.89494: Calling groups_plugins_play to load vars for managed_node1 15794 1726882628.92588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882628.99038: done with get_vars() 15794 1726882628.99090: done queuing things up, now waiting for results queue to drain 15794 1726882628.99093: results queue empty 15794 1726882628.99094: checking for any_errors_fatal 15794 1726882628.99099: done checking for any_errors_fatal 15794 1726882628.99100: checking for max_fail_percentage 15794 1726882628.99101: done checking for max_fail_percentage 15794 1726882628.99102: checking to see if all hosts have failed and the running result is not 
ok 15794 1726882628.99103: done checking to see if all hosts have failed 15794 1726882628.99104: getting the remaining hosts for this loop 15794 1726882628.99111: done getting the remaining hosts for this loop 15794 1726882628.99115: getting the next task for host managed_node1 15794 1726882628.99120: done getting next task for host managed_node1 15794 1726882628.99121: ^ task is: TASK: meta (flush_handlers) 15794 1726882628.99123: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882628.99127: getting variables 15794 1726882628.99128: in VariableManager get_vars() 15794 1726882628.99146: Calling all_inventory to load vars for managed_node1 15794 1726882628.99150: Calling groups_inventory to load vars for managed_node1 15794 1726882628.99153: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882628.99161: Calling all_plugins_play to load vars for managed_node1 15794 1726882628.99164: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882628.99168: Calling groups_plugins_play to load vars for managed_node1 15794 1726882629.03173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882629.08699: done with get_vars() 15794 1726882629.08809: done getting variables 15794 1726882629.09027: in VariableManager get_vars() 15794 1726882629.09047: Calling all_inventory to load vars for managed_node1 15794 1726882629.09050: Calling groups_inventory to load vars for managed_node1 15794 1726882629.09053: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882629.09059: Calling all_plugins_play to load vars for managed_node1 15794 1726882629.09062: Calling groups_plugins_inventory to load vars for 
managed_node1 15794 1726882629.09066: Calling groups_plugins_play to load vars for managed_node1 15794 1726882629.11714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882629.14628: done with get_vars() 15794 1726882629.14674: done queuing things up, now waiting for results queue to drain 15794 1726882629.14680: results queue empty 15794 1726882629.14681: checking for any_errors_fatal 15794 1726882629.14683: done checking for any_errors_fatal 15794 1726882629.14686: checking for max_fail_percentage 15794 1726882629.14688: done checking for max_fail_percentage 15794 1726882629.14689: checking to see if all hosts have failed and the running result is not ok 15794 1726882629.14690: done checking to see if all hosts have failed 15794 1726882629.14691: getting the remaining hosts for this loop 15794 1726882629.14692: done getting the remaining hosts for this loop 15794 1726882629.14696: getting the next task for host managed_node1 15794 1726882629.14701: done getting next task for host managed_node1 15794 1726882629.14702: ^ task is: None 15794 1726882629.14704: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882629.14705: done queuing things up, now waiting for results queue to drain 15794 1726882629.14706: results queue empty 15794 1726882629.14707: checking for any_errors_fatal 15794 1726882629.14708: done checking for any_errors_fatal 15794 1726882629.14709: checking for max_fail_percentage 15794 1726882629.14710: done checking for max_fail_percentage 15794 1726882629.14711: checking to see if all hosts have failed and the running result is not ok 15794 1726882629.14712: done checking to see if all hosts have failed 15794 1726882629.14714: getting the next task for host managed_node1 15794 1726882629.14716: done getting next task for host managed_node1 15794 1726882629.14717: ^ task is: None 15794 1726882629.14719: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882629.14768: in VariableManager get_vars() 15794 1726882629.14791: done with get_vars() 15794 1726882629.14804: in VariableManager get_vars() 15794 1726882629.14818: done with get_vars() 15794 1726882629.14824: variable 'omit' from source: magic vars 15794 1726882629.14866: in VariableManager get_vars() 15794 1726882629.14881: done with get_vars() 15794 1726882629.14908: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 15794 1726882629.15143: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882629.15168: getting the remaining hosts for this loop 15794 1726882629.15169: done getting the remaining hosts for this loop 15794 1726882629.15172: getting the next task for host managed_node1 15794 1726882629.15176: done getting next task for host managed_node1 15794 1726882629.15181: ^ task is: TASK: Gathering Facts 15794 1726882629.15183: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882629.15185: getting variables 15794 1726882629.15187: in VariableManager get_vars() 15794 1726882629.15197: Calling all_inventory to load vars for managed_node1 15794 1726882629.15200: Calling groups_inventory to load vars for managed_node1 15794 1726882629.15203: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882629.15209: Calling all_plugins_play to load vars for managed_node1 15794 1726882629.15212: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882629.15216: Calling groups_plugins_play to load vars for managed_node1 15794 1726882629.17223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882629.20087: done with get_vars() 15794 1726882629.20123: done getting variables 15794 1726882629.20187: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 21:37:09 -0400 (0:00:00.486) 0:00:26.760 ****** 15794 1726882629.20217: entering _queue_task() for managed_node1/gather_facts 15794 1726882629.20579: worker is 1 (out of 1 available) 15794 1726882629.20591: exiting _queue_task() for managed_node1/gather_facts 15794 1726882629.20603: done queuing things up, now waiting for results queue to drain 15794 1726882629.20604: waiting for pending results... 
15794 1726882629.20904: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882629.21060: in run() - task 0affe814-3a2d-94e5-e48f-000000000316 15794 1726882629.21064: variable 'ansible_search_path' from source: unknown 15794 1726882629.21082: calling self._execute() 15794 1726882629.21189: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882629.21201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882629.21440: variable 'omit' from source: magic vars 15794 1726882629.21662: variable 'ansible_distribution_major_version' from source: facts 15794 1726882629.21685: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882629.21699: variable 'omit' from source: magic vars 15794 1726882629.21740: variable 'omit' from source: magic vars 15794 1726882629.21794: variable 'omit' from source: magic vars 15794 1726882629.21846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882629.21896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882629.21927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882629.21997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882629.22001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882629.22016: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882629.22027: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882629.22039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882629.22169: Set connection var ansible_connection to ssh 15794 1726882629.22185: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882629.22199: Set connection var ansible_pipelining to False 15794 1726882629.22324: Set connection var ansible_shell_executable to /bin/sh 15794 1726882629.22327: Set connection var ansible_shell_type to sh 15794 1726882629.22331: Set connection var ansible_timeout to 10 15794 1726882629.22336: variable 'ansible_shell_executable' from source: unknown 15794 1726882629.22338: variable 'ansible_connection' from source: unknown 15794 1726882629.22341: variable 'ansible_module_compression' from source: unknown 15794 1726882629.22344: variable 'ansible_shell_type' from source: unknown 15794 1726882629.22346: variable 'ansible_shell_executable' from source: unknown 15794 1726882629.22348: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882629.22350: variable 'ansible_pipelining' from source: unknown 15794 1726882629.22352: variable 'ansible_timeout' from source: unknown 15794 1726882629.22354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882629.22550: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882629.22568: variable 'omit' from source: magic vars 15794 1726882629.22580: starting attempt loop 15794 1726882629.22588: running the handler 15794 1726882629.22613: variable 'ansible_facts' from source: unknown 15794 1726882629.22642: _low_level_execute_command(): starting 15794 1726882629.22662: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882629.23455: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882629.23476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882629.23538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882629.23586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882629.23607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882629.23624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882629.23724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882629.25530: stdout chunk (state=3): >>>/root <<< 15794 1726882629.25913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882629.25917: stdout chunk (state=3): >>><<< 15794 1726882629.25919: stderr chunk (state=3): >>><<< 15794 1726882629.25924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882629.25926: _low_level_execute_command(): starting 15794 1726882629.25929: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001 `" && echo ansible-tmp-1726882629.2579157-16766-64746356836001="` echo /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001 `" ) && sleep 0' 15794 1726882629.27093: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882629.27161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 
1726882629.27328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882629.27381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882629.27404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882629.27495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882629.27586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882629.29661: stdout chunk (state=3): >>>ansible-tmp-1726882629.2579157-16766-64746356836001=/root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001 <<< 15794 1726882629.29805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882629.29955: stderr chunk (state=3): >>><<< 15794 1726882629.29959: stdout chunk (state=3): >>><<< 15794 1726882629.30048: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882629.2579157-16766-64746356836001=/root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882629.30087: variable 'ansible_module_compression' from source: unknown 15794 1726882629.30181: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882629.30320: variable 'ansible_facts' from source: unknown 15794 1726882629.30550: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/AnsiballZ_setup.py 15794 1726882629.30795: Sending initial data 15794 1726882629.30799: Sent initial data (153 bytes) 15794 1726882629.31481: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882629.31555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882629.31596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882629.31613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882629.31666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882629.31722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882629.33370: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15794 1726882629.33415: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882629.33489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882629.33787: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpnhx_8uu3 /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/AnsiballZ_setup.py <<< 15794 1726882629.33790: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/AnsiballZ_setup.py" <<< 15794 1726882629.33860: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpnhx_8uu3" to remote "/root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/AnsiballZ_setup.py" <<< 15794 1726882629.36736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882629.36799: stderr chunk (state=3): >>><<< 15794 1726882629.36819: stdout chunk (state=3): >>><<< 15794 1726882629.36984: done transferring module to remote 15794 1726882629.36987: _low_level_execute_command(): starting 15794 1726882629.36990: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/ /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/AnsiballZ_setup.py && sleep 0' 15794 1726882629.37918: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882629.37955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882629.38048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882629.40418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882629.40433: stdout chunk (state=3): >>><<< 15794 1726882629.40461: stderr chunk (state=3): >>><<< 15794 1726882629.40480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882629.40577: _low_level_execute_command(): starting 15794 1726882629.40584: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/AnsiballZ_setup.py && sleep 0' 15794 1726882629.41457: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882629.41565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882629.41626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882629.41629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882629.41701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882630.11956: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2878, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 839, "free": 2878}, "nocache": {"free": 3483, "used": 234}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": 
null, "sas_address": null, "sas_device_handle": null<<< 15794 1726882630.11965: stdout chunk (state=3): >>>, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 584, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205296128, "block_size": 4096, "block_total": 64483404, "block_available": 61329418, "block_used": 3153986, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.4560546875, "5m": 0.4296875, 
"15m": 0.21435546875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "10", "epoch": "1726882630", "epoch_int": "1726882630", "date": "2024-09-20", "time": "21:37:10", "iso8601_micro": "2024-09-21T01:37:10.071729Z", "iso8601": "2024-09-21T01:37:10Z", "iso8601_basic": "20240920T213710071729", "iso8601_basic_short": "20240920T213710", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f550:87be:f736:b32a", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", 
"promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::f550:87be:f736:b32a"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::bb10:9a17:6b35:7604", "fe80::f550:87be:f736:b32a"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": 
{"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882630.14027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882630.14341: stderr chunk (state=3): >>>Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882630.14539: stderr chunk (state=3): >>><<< 15794 1726882630.14543: stdout chunk (state=3): >>><<< 15794 1726882630.14546: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2878, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 839, "free": 2878}, "nocache": {"free": 3483, "used": 234}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": 
"ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 584, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205296128, "block_size": 4096, "block_total": 64483404, "block_available": 61329418, "block_used": 3153986, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": 
"guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.4560546875, "5m": 0.4296875, "15m": 0.21435546875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "10", "epoch": "1726882630", "epoch_int": "1726882630", "date": "2024-09-20", "time": "21:37:10", "iso8601_micro": "2024-09-21T01:37:10.071729Z", "iso8601": "2024-09-21T01:37:10Z", "iso8601_basic": "20240920T213710071729", "iso8601_basic_short": "20240920T213710", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": 
"192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f550:87be:f736:b32a", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::f550:87be:f736:b32a"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::bb10:9a17:6b35:7604", "fe80::f550:87be:f736:b32a"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882630.15330: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882630.15468: _low_level_execute_command(): starting 15794 1726882630.15481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882629.2579157-16766-64746356836001/ > /dev/null 2>&1 && sleep 0' 15794 1726882630.16860: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882630.16865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882630.17064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882630.17250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882630.19201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882630.19275: stderr chunk (state=3): >>><<< 15794 1726882630.19291: stdout chunk (state=3): >>><<< 15794 1726882630.19316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882630.19331: handler run complete 15794 1726882630.19537: variable 'ansible_facts' from source: unknown 15794 1726882630.19684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.20174: variable 'ansible_facts' from source: unknown 15794 1726882630.20439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.20500: attempt loop complete, returning result 15794 1726882630.20511: _execute() done 15794 1726882630.20520: dumping result to json 15794 1726882630.20561: done dumping result, returning 15794 1726882630.20574: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-000000000316] 15794 1726882630.20589: sending task result for task 0affe814-3a2d-94e5-e48f-000000000316 15794 1726882630.21242: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000316 15794 1726882630.21245: WORKER PROCESS EXITING ok: [managed_node1] 15794 1726882630.21771: no more pending results, returning what we have 15794 1726882630.21774: results queue empty 15794 1726882630.21775: checking for any_errors_fatal 15794 1726882630.21777: done checking for any_errors_fatal 15794 1726882630.21780: checking for max_fail_percentage 15794 1726882630.21781: done checking for max_fail_percentage 15794 1726882630.21782: checking to see if all hosts have failed and the running result is not ok 15794 1726882630.21783: done checking to see if all hosts have failed 15794 1726882630.21784: getting the 
remaining hosts for this loop 15794 1726882630.21786: done getting the remaining hosts for this loop 15794 1726882630.21790: getting the next task for host managed_node1 15794 1726882630.21796: done getting next task for host managed_node1 15794 1726882630.21798: ^ task is: TASK: meta (flush_handlers) 15794 1726882630.21800: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882630.21804: getting variables 15794 1726882630.21806: in VariableManager get_vars() 15794 1726882630.21829: Calling all_inventory to load vars for managed_node1 15794 1726882630.21832: Calling groups_inventory to load vars for managed_node1 15794 1726882630.21837: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.21848: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.21851: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882630.21855: Calling groups_plugins_play to load vars for managed_node1 15794 1726882630.24389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.27706: done with get_vars() 15794 1726882630.27747: done getting variables 15794 1726882630.27836: in VariableManager get_vars() 15794 1726882630.27849: Calling all_inventory to load vars for managed_node1 15794 1726882630.27852: Calling groups_inventory to load vars for managed_node1 15794 1726882630.27855: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.27861: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.27864: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882630.27868: Calling groups_plugins_play to load vars for managed_node1 15794 
1726882630.30244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.33108: done with get_vars() 15794 1726882630.33149: done queuing things up, now waiting for results queue to drain 15794 1726882630.33152: results queue empty 15794 1726882630.33153: checking for any_errors_fatal 15794 1726882630.33157: done checking for any_errors_fatal 15794 1726882630.33158: checking for max_fail_percentage 15794 1726882630.33160: done checking for max_fail_percentage 15794 1726882630.33161: checking to see if all hosts have failed and the running result is not ok 15794 1726882630.33162: done checking to see if all hosts have failed 15794 1726882630.33163: getting the remaining hosts for this loop 15794 1726882630.33168: done getting the remaining hosts for this loop 15794 1726882630.33171: getting the next task for host managed_node1 15794 1726882630.33176: done getting next task for host managed_node1 15794 1726882630.33181: ^ task is: TASK: Show network_provider 15794 1726882630.33183: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882630.33186: getting variables 15794 1726882630.33187: in VariableManager get_vars() 15794 1726882630.33198: Calling all_inventory to load vars for managed_node1 15794 1726882630.33201: Calling groups_inventory to load vars for managed_node1 15794 1726882630.33204: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.33210: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.33213: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882630.33217: Calling groups_plugins_play to load vars for managed_node1 15794 1726882630.35576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.39276: done with get_vars() 15794 1726882630.39330: done getting variables 15794 1726882630.39408: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 21:37:10 -0400 (0:00:01.192) 0:00:27.952 ****** 15794 1726882630.39445: entering _queue_task() for managed_node1/debug 15794 1726882630.40051: worker is 1 (out of 1 available) 15794 1726882630.40065: exiting _queue_task() for managed_node1/debug 15794 1726882630.40079: done queuing things up, now waiting for results queue to drain 15794 1726882630.40080: waiting for pending results... 
15794 1726882630.40477: running TaskExecutor() for managed_node1/TASK: Show network_provider 15794 1726882630.40602: in run() - task 0affe814-3a2d-94e5-e48f-000000000033 15794 1726882630.40642: variable 'ansible_search_path' from source: unknown 15794 1726882630.40692: calling self._execute() 15794 1726882630.40830: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882630.40849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882630.40868: variable 'omit' from source: magic vars 15794 1726882630.41357: variable 'ansible_distribution_major_version' from source: facts 15794 1726882630.41388: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882630.41439: variable 'omit' from source: magic vars 15794 1726882630.41459: variable 'omit' from source: magic vars 15794 1726882630.41506: variable 'omit' from source: magic vars 15794 1726882630.41557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882630.41609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882630.41641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882630.41669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882630.41687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882630.41766: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882630.41770: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882630.41831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882630.41915: Set connection var ansible_connection to ssh 15794 1726882630.41936: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882630.41950: Set connection var ansible_pipelining to False 15794 1726882630.41961: Set connection var ansible_shell_executable to /bin/sh 15794 1726882630.41968: Set connection var ansible_shell_type to sh 15794 1726882630.41983: Set connection var ansible_timeout to 10 15794 1726882630.42044: variable 'ansible_shell_executable' from source: unknown 15794 1726882630.42140: variable 'ansible_connection' from source: unknown 15794 1726882630.42143: variable 'ansible_module_compression' from source: unknown 15794 1726882630.42148: variable 'ansible_shell_type' from source: unknown 15794 1726882630.42150: variable 'ansible_shell_executable' from source: unknown 15794 1726882630.42153: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882630.42155: variable 'ansible_pipelining' from source: unknown 15794 1726882630.42157: variable 'ansible_timeout' from source: unknown 15794 1726882630.42159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882630.42276: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882630.42295: variable 'omit' from source: magic vars 15794 1726882630.42306: starting attempt loop 15794 1726882630.42338: running the handler 15794 1726882630.42373: variable 'network_provider' from source: set_fact 15794 1726882630.42475: variable 'network_provider' from source: set_fact 15794 1726882630.42498: handler run complete 15794 1726882630.42525: attempt loop complete, returning result 15794 1726882630.42603: _execute() done 15794 1726882630.42607: dumping result to json 15794 1726882630.42609: done dumping result, returning 15794 1726882630.42612: done running 
TaskExecutor() for managed_node1/TASK: Show network_provider [0affe814-3a2d-94e5-e48f-000000000033] 15794 1726882630.42614: sending task result for task 0affe814-3a2d-94e5-e48f-000000000033 15794 1726882630.42691: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000033 15794 1726882630.42694: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 15794 1726882630.42767: no more pending results, returning what we have 15794 1726882630.42772: results queue empty 15794 1726882630.42773: checking for any_errors_fatal 15794 1726882630.42776: done checking for any_errors_fatal 15794 1726882630.42777: checking for max_fail_percentage 15794 1726882630.42779: done checking for max_fail_percentage 15794 1726882630.42780: checking to see if all hosts have failed and the running result is not ok 15794 1726882630.42781: done checking to see if all hosts have failed 15794 1726882630.42782: getting the remaining hosts for this loop 15794 1726882630.42785: done getting the remaining hosts for this loop 15794 1726882630.42789: getting the next task for host managed_node1 15794 1726882630.42798: done getting next task for host managed_node1 15794 1726882630.42801: ^ task is: TASK: meta (flush_handlers) 15794 1726882630.42803: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882630.42808: getting variables 15794 1726882630.42810: in VariableManager get_vars() 15794 1726882630.42846: Calling all_inventory to load vars for managed_node1 15794 1726882630.42849: Calling groups_inventory to load vars for managed_node1 15794 1726882630.42853: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.42866: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.42869: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882630.42873: Calling groups_plugins_play to load vars for managed_node1 15794 1726882630.45389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.48835: done with get_vars() 15794 1726882630.48895: done getting variables 15794 1726882630.49023: in VariableManager get_vars() 15794 1726882630.49040: Calling all_inventory to load vars for managed_node1 15794 1726882630.49043: Calling groups_inventory to load vars for managed_node1 15794 1726882630.49047: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.49053: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.49057: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882630.49061: Calling groups_plugins_play to load vars for managed_node1 15794 1726882630.51378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.54311: done with get_vars() 15794 1726882630.54403: done queuing things up, now waiting for results queue to drain 15794 1726882630.54406: results queue empty 15794 1726882630.54407: checking for any_errors_fatal 15794 1726882630.54410: done checking for any_errors_fatal 15794 1726882630.54412: checking for max_fail_percentage 15794 1726882630.54413: done checking for max_fail_percentage 15794 1726882630.54414: checking to see if all hosts have failed and the running result is not 
ok 15794 1726882630.54415: done checking to see if all hosts have failed 15794 1726882630.54416: getting the remaining hosts for this loop 15794 1726882630.54417: done getting the remaining hosts for this loop 15794 1726882630.54420: getting the next task for host managed_node1 15794 1726882630.54431: done getting next task for host managed_node1 15794 1726882630.54433: ^ task is: TASK: meta (flush_handlers) 15794 1726882630.54436: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882630.54439: getting variables 15794 1726882630.54441: in VariableManager get_vars() 15794 1726882630.54453: Calling all_inventory to load vars for managed_node1 15794 1726882630.54456: Calling groups_inventory to load vars for managed_node1 15794 1726882630.54460: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.54467: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.54471: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882630.54475: Calling groups_plugins_play to load vars for managed_node1 15794 1726882630.56625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.59598: done with get_vars() 15794 1726882630.59637: done getting variables 15794 1726882630.59709: in VariableManager get_vars() 15794 1726882630.59722: Calling all_inventory to load vars for managed_node1 15794 1726882630.59724: Calling groups_inventory to load vars for managed_node1 15794 1726882630.59743: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.59751: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.59755: Calling groups_plugins_inventory to load vars for 
managed_node1 15794 1726882630.59759: Calling groups_plugins_play to load vars for managed_node1 15794 1726882630.61964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.65149: done with get_vars() 15794 1726882630.65188: done queuing things up, now waiting for results queue to drain 15794 1726882630.65191: results queue empty 15794 1726882630.65192: checking for any_errors_fatal 15794 1726882630.65193: done checking for any_errors_fatal 15794 1726882630.65194: checking for max_fail_percentage 15794 1726882630.65196: done checking for max_fail_percentage 15794 1726882630.65196: checking to see if all hosts have failed and the running result is not ok 15794 1726882630.65197: done checking to see if all hosts have failed 15794 1726882630.65198: getting the remaining hosts for this loop 15794 1726882630.65199: done getting the remaining hosts for this loop 15794 1726882630.65202: getting the next task for host managed_node1 15794 1726882630.65206: done getting next task for host managed_node1 15794 1726882630.65207: ^ task is: None 15794 1726882630.65209: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882630.65210: done queuing things up, now waiting for results queue to drain 15794 1726882630.65211: results queue empty 15794 1726882630.65212: checking for any_errors_fatal 15794 1726882630.65213: done checking for any_errors_fatal 15794 1726882630.65214: checking for max_fail_percentage 15794 1726882630.65215: done checking for max_fail_percentage 15794 1726882630.65216: checking to see if all hosts have failed and the running result is not ok 15794 1726882630.65217: done checking to see if all hosts have failed 15794 1726882630.65218: getting the next task for host managed_node1 15794 1726882630.65220: done getting next task for host managed_node1 15794 1726882630.65221: ^ task is: None 15794 1726882630.65223: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882630.65261: in VariableManager get_vars() 15794 1726882630.65288: done with get_vars() 15794 1726882630.65295: in VariableManager get_vars() 15794 1726882630.65312: done with get_vars() 15794 1726882630.65318: variable 'omit' from source: magic vars 15794 1726882630.65497: variable 'profile' from source: play vars 15794 1726882630.65667: in VariableManager get_vars() 15794 1726882630.65684: done with get_vars() 15794 1726882630.65711: variable 'omit' from source: magic vars 15794 1726882630.65812: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15794 1726882630.67574: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882630.67598: getting the remaining hosts for this loop 15794 1726882630.67600: done getting the remaining hosts for this loop 15794 1726882630.67603: getting the next task for host managed_node1 15794 1726882630.67606: done getting next task for host managed_node1 15794 1726882630.67609: ^ task is: TASK: Gathering Facts 15794 1726882630.67611: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882630.67613: getting variables 15794 1726882630.67614: in VariableManager get_vars() 15794 1726882630.67627: Calling all_inventory to load vars for managed_node1 15794 1726882630.67630: Calling groups_inventory to load vars for managed_node1 15794 1726882630.67633: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882630.67643: Calling all_plugins_play to load vars for managed_node1 15794 1726882630.67647: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882630.67650: Calling groups_plugins_play to load vars for managed_node1 15794 1726882630.77263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882630.81931: done with get_vars() 15794 1726882630.81987: done getting variables 15794 1726882630.82048: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:37:10 -0400 (0:00:00.426) 0:00:28.378 ****** 15794 1726882630.82088: entering _queue_task() for managed_node1/gather_facts 15794 1726882630.82542: worker is 1 (out of 1 available) 15794 1726882630.82555: exiting _queue_task() for managed_node1/gather_facts 15794 1726882630.82568: done queuing things up, now waiting for results queue to drain 15794 1726882630.82569: waiting for pending results... 
15794 1726882630.82838: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882630.82981: in run() - task 0affe814-3a2d-94e5-e48f-00000000032b 15794 1726882630.83018: variable 'ansible_search_path' from source: unknown 15794 1726882630.83128: calling self._execute() 15794 1726882630.83186: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882630.83202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882630.83218: variable 'omit' from source: magic vars 15794 1726882630.83728: variable 'ansible_distribution_major_version' from source: facts 15794 1726882630.83750: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882630.83762: variable 'omit' from source: magic vars 15794 1726882630.83813: variable 'omit' from source: magic vars 15794 1726882630.83865: variable 'omit' from source: magic vars 15794 1726882630.83927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882630.83977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882630.84030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882630.84050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882630.84104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882630.84117: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882630.84128: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882630.84141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882630.84279: Set connection var ansible_connection to ssh 15794 1726882630.84295: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882630.84321: Set connection var ansible_pipelining to False 15794 1726882630.84325: Set connection var ansible_shell_executable to /bin/sh 15794 1726882630.84360: Set connection var ansible_shell_type to sh 15794 1726882630.84364: Set connection var ansible_timeout to 10 15794 1726882630.84395: variable 'ansible_shell_executable' from source: unknown 15794 1726882630.84405: variable 'ansible_connection' from source: unknown 15794 1726882630.84413: variable 'ansible_module_compression' from source: unknown 15794 1726882630.84421: variable 'ansible_shell_type' from source: unknown 15794 1726882630.84442: variable 'ansible_shell_executable' from source: unknown 15794 1726882630.84469: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882630.84472: variable 'ansible_pipelining' from source: unknown 15794 1726882630.84475: variable 'ansible_timeout' from source: unknown 15794 1726882630.84478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882630.84739: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882630.84743: variable 'omit' from source: magic vars 15794 1726882630.84746: starting attempt loop 15794 1726882630.84748: running the handler 15794 1726882630.84766: variable 'ansible_facts' from source: unknown 15794 1726882630.84803: _low_level_execute_command(): starting 15794 1726882630.84869: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882630.85643: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882630.85660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15794 1726882630.85717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882630.85732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882630.85827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882630.85847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882630.85863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882630.86052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882630.86145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882630.87938: stdout chunk (state=3): >>>/root <<< 15794 1726882630.88224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882630.88345: stderr chunk (state=3): >>><<< 15794 1726882630.88459: stdout chunk (state=3): >>><<< 15794 1726882630.88600: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882630.88604: _low_level_execute_command(): starting 15794 1726882630.88608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703 `" && echo ansible-tmp-1726882630.884878-16830-100278479249703="` echo /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703 `" ) && sleep 0' 15794 1726882630.90076: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882630.90089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882630.90131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882630.90184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882630.90243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882630.90279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882630.92315: stdout chunk (state=3): >>>ansible-tmp-1726882630.884878-16830-100278479249703=/root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703 <<< 15794 1726882630.92458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882630.92531: stderr chunk (state=3): >>><<< 15794 1726882630.92565: stdout chunk (state=3): >>><<< 15794 1726882630.92847: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882630.884878-16830-100278479249703=/root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882630.92850: variable 'ansible_module_compression' from source: unknown 15794 1726882630.92853: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882630.92856: variable 'ansible_facts' from source: unknown 15794 1726882630.93297: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/AnsiballZ_setup.py 15794 1726882630.93748: Sending initial data 15794 1726882630.93760: Sent initial data (153 bytes) 15794 1726882630.95268: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882630.95286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882630.95382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882630.95476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882630.95531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882630.97347: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882630.97351: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882630.97379: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpkzdb9e1u /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/AnsiballZ_setup.py <<< 15794 1726882630.97386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/AnsiballZ_setup.py" <<< 15794 1726882630.97452: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpkzdb9e1u" to remote "/root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/AnsiballZ_setup.py" <<< 15794 1726882631.01273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882631.01461: stderr chunk (state=3): >>><<< 15794 1726882631.01470: stdout chunk (state=3): >>><<< 15794 1726882631.01497: done transferring module to remote 15794 1726882631.01511: _low_level_execute_command(): starting 15794 1726882631.01518: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/ /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/AnsiballZ_setup.py && sleep 0' 15794 1726882631.02995: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882631.03118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882631.03122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 
1726882631.03125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882631.03128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882631.03245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882631.03335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882631.05270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882631.05333: stderr chunk (state=3): >>><<< 15794 1726882631.05558: stdout chunk (state=3): >>><<< 15794 1726882631.05561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882631.05568: _low_level_execute_command(): starting 15794 1726882631.05570: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/AnsiballZ_setup.py && sleep 0' 15794 1726882631.06739: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882631.06790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882631.06806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882631.06873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882631.06894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882631.07107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882631.07111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882631.07130: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 15794 1726882631.07255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882631.07389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882631.75413: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "11", "epoch": "1726882631", "epoch_int": "1726882631", "date": "2024-09-20", "time": "21:37:11", "iso8601_micro": "2024-09-21T01:37:11.381022Z", "iso8601": "2024-09-21T01:37:11Z", "iso8601_basic": "20240920T213711381022", "iso8601_basic_short": "20240920T213711", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.41943359375, "5m": 0.42236328125, "15m": 0.212890625}, "ansible_env": {"SHELL": 
"/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": 
{}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_interfaces": ["peerlsr27", "eth0", "lo", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f550:87be:f736:b32a", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::f550:87be:f736:b32a"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::bb10:9a17:6b35:7604", "fe80::f550:87be:f736:b32a"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2851, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 866, "free": 2851}, "nocache": {"free": 3456, "used": 261}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 585, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205287936, "block_size": 4096, "block_total": 64483404, "block_available": 61329416, "block_used": 3153988, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882631.77629: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882631.77849: stdout chunk (state=3): >>><<< 15794 1726882631.77853: stderr chunk (state=3): >>><<< 15794 1726882631.77857: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "11", "epoch": "1726882631", "epoch_int": "1726882631", "date": "2024-09-20", "time": "21:37:11", "iso8601_micro": "2024-09-21T01:37:11.381022Z", "iso8601": "2024-09-21T01:37:11Z", "iso8601_basic": "20240920T213711381022", "iso8601_basic_short": "20240920T213711", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.41943359375, "5m": 0.42236328125, "15m": 0.212890625}, "ansible_env": {"SHELL": 
"/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": 
{}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_interfaces": ["peerlsr27", "eth0", "lo", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f550:87be:f736:b32a", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::f550:87be:f736:b32a"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::bb10:9a17:6b35:7604", "fe80::f550:87be:f736:b32a"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2851, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 866, "free": 2851}, "nocache": {"free": 3456, "used": 261}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 585, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205287936, "block_size": 4096, "block_total": 64483404, "block_available": 61329416, "block_used": 3153988, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882631.78822: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882631.78967: _low_level_execute_command(): starting 15794 1726882631.78979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882630.884878-16830-100278479249703/ > /dev/null 2>&1 && sleep 0' 15794 1726882631.80695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882631.80724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882631.80742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882631.80854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882631.80958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882631.82975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882631.83129: stderr chunk (state=3): >>><<< 15794 1726882631.83143: stdout chunk (state=3): >>><<< 15794 1726882631.83191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882631.83208: handler run complete 15794 1726882631.83432: variable 'ansible_facts' from source: unknown 15794 1726882631.83589: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882631.84126: variable 'ansible_facts' from source: unknown 15794 1726882631.84315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882631.84485: attempt loop complete, returning result 15794 1726882631.84499: _execute() done 15794 1726882631.84509: dumping result to json 15794 1726882631.84558: done dumping result, returning 15794 1726882631.84571: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-00000000032b] 15794 1726882631.84582: sending task result for task 0affe814-3a2d-94e5-e48f-00000000032b ok: [managed_node1] 15794 1726882631.85815: no more pending results, returning what we have 15794 1726882631.85819: results queue empty 15794 1726882631.85820: checking for any_errors_fatal 15794 1726882631.85821: done checking for any_errors_fatal 15794 1726882631.85822: checking for max_fail_percentage 15794 1726882631.85824: done checking for max_fail_percentage 15794 1726882631.85825: checking to see if all hosts have failed and the running result is not ok 15794 1726882631.85825: done checking to see if all hosts have failed 15794 1726882631.85826: getting the remaining hosts for this loop 15794 1726882631.85828: done getting the remaining hosts for this loop 15794 1726882631.85832: getting the next task for host managed_node1 15794 1726882631.85840: done getting next task for host managed_node1 15794 1726882631.85842: ^ task is: TASK: meta (flush_handlers) 15794 1726882631.85844: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882631.85848: getting variables 15794 1726882631.85850: in VariableManager get_vars() 15794 1726882631.85888: Calling all_inventory to load vars for managed_node1 15794 1726882631.85892: Calling groups_inventory to load vars for managed_node1 15794 1726882631.85895: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882631.85903: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000032b 15794 1726882631.85907: WORKER PROCESS EXITING 15794 1726882631.85922: Calling all_plugins_play to load vars for managed_node1 15794 1726882631.85927: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882631.85933: Calling groups_plugins_play to load vars for managed_node1 15794 1726882631.88557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882631.91797: done with get_vars() 15794 1726882631.91837: done getting variables 15794 1726882631.91943: in VariableManager get_vars() 15794 1726882631.91957: Calling all_inventory to load vars for managed_node1 15794 1726882631.91959: Calling groups_inventory to load vars for managed_node1 15794 1726882631.91962: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882631.91972: Calling all_plugins_play to load vars for managed_node1 15794 1726882631.91975: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882631.91978: Calling groups_plugins_play to load vars for managed_node1 15794 1726882631.94075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882631.97131: done with get_vars() 15794 1726882631.97186: done queuing things up, now waiting for results queue to drain 15794 1726882631.97189: results queue empty 15794 1726882631.97190: checking for any_errors_fatal 15794 1726882631.97196: done checking for any_errors_fatal 15794 1726882631.97197: checking for max_fail_percentage 15794 
1726882631.97199: done checking for max_fail_percentage 15794 1726882631.97200: checking to see if all hosts have failed and the running result is not ok 15794 1726882631.97201: done checking to see if all hosts have failed 15794 1726882631.97206: getting the remaining hosts for this loop 15794 1726882631.97207: done getting the remaining hosts for this loop 15794 1726882631.97211: getting the next task for host managed_node1 15794 1726882631.97216: done getting next task for host managed_node1 15794 1726882631.97220: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15794 1726882631.97222: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882631.97236: getting variables 15794 1726882631.97238: in VariableManager get_vars() 15794 1726882631.97262: Calling all_inventory to load vars for managed_node1 15794 1726882631.97265: Calling groups_inventory to load vars for managed_node1 15794 1726882631.97268: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882631.97275: Calling all_plugins_play to load vars for managed_node1 15794 1726882631.97278: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882631.97282: Calling groups_plugins_play to load vars for managed_node1 15794 1726882631.99691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882632.03343: done with get_vars() 15794 1726882632.03392: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:12 -0400 (0:00:01.214) 0:00:29.593 
****** 15794 1726882632.03531: entering _queue_task() for managed_node1/include_tasks 15794 1726882632.03988: worker is 1 (out of 1 available) 15794 1726882632.04040: exiting _queue_task() for managed_node1/include_tasks 15794 1726882632.04054: done queuing things up, now waiting for results queue to drain 15794 1726882632.04055: waiting for pending results... 15794 1726882632.04629: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15794 1726882632.04980: in run() - task 0affe814-3a2d-94e5-e48f-00000000003c 15794 1726882632.05001: variable 'ansible_search_path' from source: unknown 15794 1726882632.05006: variable 'ansible_search_path' from source: unknown 15794 1726882632.05056: calling self._execute() 15794 1726882632.05541: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882632.05548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882632.05551: variable 'omit' from source: magic vars 15794 1726882632.06005: variable 'ansible_distribution_major_version' from source: facts 15794 1726882632.06049: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882632.06053: _execute() done 15794 1726882632.06056: dumping result to json 15794 1726882632.06070: done dumping result, returning 15794 1726882632.06085: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-94e5-e48f-00000000003c] 15794 1726882632.06089: sending task result for task 0affe814-3a2d-94e5-e48f-00000000003c 15794 1726882632.06349: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000003c 15794 1726882632.06353: WORKER PROCESS EXITING 15794 1726882632.06401: no more pending results, returning what we have 15794 1726882632.06407: in VariableManager get_vars() 15794 1726882632.06452: Calling all_inventory to load vars for managed_node1 15794 
1726882632.06455: Calling groups_inventory to load vars for managed_node1 15794 1726882632.06458: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882632.06468: Calling all_plugins_play to load vars for managed_node1 15794 1726882632.06471: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882632.06474: Calling groups_plugins_play to load vars for managed_node1 15794 1726882632.10421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882632.15293: done with get_vars() 15794 1726882632.15341: variable 'ansible_search_path' from source: unknown 15794 1726882632.15344: variable 'ansible_search_path' from source: unknown 15794 1726882632.15382: we have included files to process 15794 1726882632.15383: generating all_blocks data 15794 1726882632.15385: done generating all_blocks data 15794 1726882632.15386: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882632.15387: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882632.15390: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882632.17080: done processing included file 15794 1726882632.17082: iterating over new_blocks loaded from include file 15794 1726882632.17084: in VariableManager get_vars() 15794 1726882632.17405: done with get_vars() 15794 1726882632.17408: filtering new block on tags 15794 1726882632.17537: done filtering new block on tags 15794 1726882632.17541: in VariableManager get_vars() 15794 1726882632.17673: done with get_vars() 15794 1726882632.17675: filtering new block on tags 15794 1726882632.17784: done filtering new block on tags 15794 1726882632.17820: in VariableManager get_vars() 15794 1726882632.17850: done with get_vars() 15794 
1726882632.17852: filtering new block on tags 15794 1726882632.17874: done filtering new block on tags 15794 1726882632.17877: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15794 1726882632.17883: extending task lists for all hosts with included blocks 15794 1726882632.18911: done extending task lists 15794 1726882632.18913: done processing included files 15794 1726882632.18914: results queue empty 15794 1726882632.18915: checking for any_errors_fatal 15794 1726882632.18916: done checking for any_errors_fatal 15794 1726882632.18917: checking for max_fail_percentage 15794 1726882632.18968: done checking for max_fail_percentage 15794 1726882632.18970: checking to see if all hosts have failed and the running result is not ok 15794 1726882632.18971: done checking to see if all hosts have failed 15794 1726882632.18972: getting the remaining hosts for this loop 15794 1726882632.18974: done getting the remaining hosts for this loop 15794 1726882632.18996: getting the next task for host managed_node1 15794 1726882632.19002: done getting next task for host managed_node1 15794 1726882632.19005: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15794 1726882632.19008: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882632.19019: getting variables 15794 1726882632.19020: in VariableManager get_vars() 15794 1726882632.19045: Calling all_inventory to load vars for managed_node1 15794 1726882632.19049: Calling groups_inventory to load vars for managed_node1 15794 1726882632.19051: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882632.19058: Calling all_plugins_play to load vars for managed_node1 15794 1726882632.19061: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882632.19065: Calling groups_plugins_play to load vars for managed_node1 15794 1726882632.22987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882632.25951: done with get_vars() 15794 1726882632.25985: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:12 -0400 (0:00:00.225) 0:00:29.818 ****** 15794 1726882632.26089: entering _queue_task() for managed_node1/setup 15794 1726882632.26465: worker is 1 (out of 1 available) 15794 1726882632.26479: exiting _queue_task() for managed_node1/setup 15794 1726882632.26495: done queuing things up, now waiting for results queue to drain 15794 1726882632.26496: waiting for pending results... 
15794 1726882632.26803: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15794 1726882632.26953: in run() - task 0affe814-3a2d-94e5-e48f-00000000036c 15794 1726882632.26958: variable 'ansible_search_path' from source: unknown 15794 1726882632.26961: variable 'ansible_search_path' from source: unknown 15794 1726882632.27040: calling self._execute() 15794 1726882632.27129: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882632.27140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882632.27214: variable 'omit' from source: magic vars 15794 1726882632.27957: variable 'ansible_distribution_major_version' from source: facts 15794 1726882632.28084: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882632.29042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882632.31469: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882632.31588: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882632.31674: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882632.31715: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882632.31786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882632.31931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882632.32005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882632.32054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882632.32124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882632.32156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882632.32250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882632.32294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882632.32343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882632.32411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882632.32445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882632.32709: variable '__network_required_facts' from source: role '' defaults
15794 1726882632.32726: variable 'ansible_facts' from source: unknown
15794 1726882632.34077: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
15794 1726882632.34104: when evaluation is False, skipping this task
15794 1726882632.34107: _execute() done
15794 1726882632.34109: dumping result to json
15794 1726882632.34181: done dumping result, returning
15794 1726882632.34185: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-94e5-e48f-00000000036c]
15794 1726882632.34188: sending task result for task 0affe814-3a2d-94e5-e48f-00000000036c
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
15794 1726882632.34593: no more pending results, returning what we have
15794 1726882632.34599: results queue empty
15794 1726882632.34600: checking for any_errors_fatal
15794 1726882632.34602: done checking for any_errors_fatal
15794 1726882632.34603: checking for max_fail_percentage
15794 1726882632.34605: done checking for max_fail_percentage
15794 1726882632.34606: checking to see if all hosts have failed and the running result is not ok
15794 1726882632.34607: done checking to see if all hosts have failed
15794 1726882632.34608: getting the remaining hosts for this loop
15794 1726882632.34611: done getting the remaining hosts for this loop
15794 1726882632.34615: getting the next task for host managed_node1
15794 1726882632.34632: done getting next task for host managed_node1
15794 1726882632.34843: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
15794 1726882632.34848: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882632.34868: getting variables
15794 1726882632.34870: in VariableManager get_vars()
15794 1726882632.34920: Calling all_inventory to load vars for managed_node1
15794 1726882632.34924: Calling groups_inventory to load vars for managed_node1
15794 1726882632.34926: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882632.34950: Calling all_plugins_play to load vars for managed_node1
15794 1726882632.34955: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882632.34960: Calling groups_plugins_play to load vars for managed_node1
15794 1726882632.35566: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000036c
15794 1726882632.35570: WORKER PROCESS EXITING
15794 1726882632.37988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882632.41192: done with get_vars()
15794 1726882632.41240: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:37:12 -0400 (0:00:00.152) 0:00:29.971 ******
15794 1726882632.41370: entering _queue_task() for managed_node1/stat
15794 1726882632.41840: worker is 1 (out of 1 available)
15794 1726882632.41853: exiting _queue_task() for managed_node1/stat
15794 1726882632.42057: done queuing things up, now waiting for results queue to drain
15794 1726882632.42059: waiting for pending results...
15794 1726882632.42320: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
15794 1726882632.42511: in run() - task 0affe814-3a2d-94e5-e48f-00000000036e
15794 1726882632.42531: variable 'ansible_search_path' from source: unknown
15794 1726882632.42541: variable 'ansible_search_path' from source: unknown
15794 1726882632.42597: calling self._execute()
15794 1726882632.42720: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882632.42738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882632.42759: variable 'omit' from source: magic vars
15794 1726882632.43203: variable 'ansible_distribution_major_version' from source: facts
15794 1726882632.43215: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882632.43416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15794 1726882632.43719: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15794 1726882632.43829: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15794 1726882632.43884: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15794 1726882632.43939: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15794 1726882632.44096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15794 1726882632.44133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15794 1726882632.44207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882632.44262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15794 1726882632.44419: variable '__network_is_ostree' from source: set_fact
15794 1726882632.44432: Evaluated conditional (not __network_is_ostree is defined): False
15794 1726882632.44444: when evaluation is False, skipping this task
15794 1726882632.44452: _execute() done
15794 1726882632.44460: dumping result to json
15794 1726882632.44499: done dumping result, returning
15794 1726882632.44503: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-94e5-e48f-00000000036e]
15794 1726882632.44511: sending task result for task 0affe814-3a2d-94e5-e48f-00000000036e
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15794 1726882632.44697: no more pending results, returning what we have
15794 1726882632.44702: results queue empty
15794 1726882632.44703: checking for any_errors_fatal
15794 1726882632.44712: done checking for any_errors_fatal
15794 1726882632.44713: checking for max_fail_percentage
15794 1726882632.44715: done checking for max_fail_percentage
15794 1726882632.44716: checking to see if all hosts have failed and the running result is not ok
15794 1726882632.44717: done checking to see if all hosts have failed
15794 1726882632.44719: getting the remaining hosts for this loop
15794 1726882632.44721: done getting the remaining hosts for this loop
15794 1726882632.44727: getting the next task for host managed_node1
15794 1726882632.44737: done getting next task for host managed_node1
15794 1726882632.44742: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
15794 1726882632.44747: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882632.44767: getting variables
15794 1726882632.44769: in VariableManager get_vars()
15794 1726882632.44816: Calling all_inventory to load vars for managed_node1
15794 1726882632.44820: Calling groups_inventory to load vars for managed_node1
15794 1726882632.44823: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882632.45058: Calling all_plugins_play to load vars for managed_node1
15794 1726882632.45064: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882632.45071: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000036e
15794 1726882632.45074: WORKER PROCESS EXITING
15794 1726882632.45082: Calling groups_plugins_play to load vars for managed_node1
15794 1726882632.47943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882632.50909: done with get_vars()
15794 1726882632.50947: done getting variables
15794 1726882632.51016: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:37:12 -0400 (0:00:00.096) 0:00:30.068 ******
15794 1726882632.51058: entering _queue_task() for managed_node1/set_fact
15794 1726882632.51385: worker is 1 (out of 1 available)
15794 1726882632.51398: exiting _queue_task() for managed_node1/set_fact
15794 1726882632.51411: done queuing things up, now waiting for results queue to drain
15794 1726882632.51413: waiting for pending results...
15794 1726882632.51717: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
15794 1726882632.51868: in run() - task 0affe814-3a2d-94e5-e48f-00000000036f
15794 1726882632.51881: variable 'ansible_search_path' from source: unknown
15794 1726882632.51885: variable 'ansible_search_path' from source: unknown
15794 1726882632.51922: calling self._execute()
15794 1726882632.52028: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882632.52038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882632.52051: variable 'omit' from source: magic vars
15794 1726882632.52482: variable 'ansible_distribution_major_version' from source: facts
15794 1726882632.52493: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882632.52700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15794 1726882632.53004: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15794 1726882632.53061: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15794 1726882632.53104: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15794 1726882632.53146: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15794 1726882632.53245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15794 1726882632.53280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15794 1726882632.53311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882632.53351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15794 1726882632.53458: variable '__network_is_ostree' from source: set_fact
15794 1726882632.53466: Evaluated conditional (not __network_is_ostree is defined): False
15794 1726882632.53470: when evaluation is False, skipping this task
15794 1726882632.53473: _execute() done
15794 1726882632.53497: dumping result to json
15794 1726882632.53507: done dumping result, returning
15794 1726882632.53511: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-94e5-e48f-00000000036f]
15794 1726882632.53514: sending task result for task 0affe814-3a2d-94e5-e48f-00000000036f
15794 1726882632.53684: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000036f
15794 1726882632.53688: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15794 1726882632.53742: no more pending results, returning what we have
15794 1726882632.53747: results queue empty
15794 1726882632.53748: checking for any_errors_fatal
15794 1726882632.53755: done checking for any_errors_fatal
15794 1726882632.53756: checking for max_fail_percentage
15794 1726882632.53758: done checking for max_fail_percentage
15794 1726882632.53759: checking to see if all hosts have failed and the running result is not ok
15794 1726882632.53760: done checking to see if all hosts have failed
15794 1726882632.53761: getting the remaining hosts for this loop
15794 1726882632.53764: done getting the remaining hosts for this loop
15794 1726882632.53768: getting the next task for host managed_node1
15794 1726882632.53778: done getting next task for host managed_node1
15794 1726882632.53782: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
15794 1726882632.53785: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882632.53801: getting variables
15794 1726882632.53803: in VariableManager get_vars()
15794 1726882632.53847: Calling all_inventory to load vars for managed_node1
15794 1726882632.53851: Calling groups_inventory to load vars for managed_node1
15794 1726882632.53855: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882632.53866: Calling all_plugins_play to load vars for managed_node1
15794 1726882632.53871: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882632.53875: Calling groups_plugins_play to load vars for managed_node1
15794 1726882632.56182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882632.59226: done with get_vars()
15794 1726882632.59266: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:37:12 -0400 (0:00:00.083) 0:00:30.151 ******
15794 1726882632.59377: entering _queue_task() for managed_node1/service_facts
15794 1726882632.59719: worker is 1 (out of 1 available)
15794 1726882632.59732: exiting _queue_task() for managed_node1/service_facts
15794 1726882632.59750: done queuing things up, now waiting for results queue to drain
15794 1726882632.59752: waiting for pending results...
15794 1726882632.60153: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running
15794 1726882632.60305: in run() - task 0affe814-3a2d-94e5-e48f-000000000371
15794 1726882632.60310: variable 'ansible_search_path' from source: unknown
15794 1726882632.60313: variable 'ansible_search_path' from source: unknown
15794 1726882632.60340: calling self._execute()
15794 1726882632.60460: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882632.60475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882632.60497: variable 'omit' from source: magic vars
15794 1726882632.61000: variable 'ansible_distribution_major_version' from source: facts
15794 1726882632.61061: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882632.61069: variable 'omit' from source: magic vars
15794 1726882632.61117: variable 'omit' from source: magic vars
15794 1726882632.61171: variable 'omit' from source: magic vars
15794 1726882632.61388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15794 1726882632.61425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15794 1726882632.61451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15794 1726882632.61473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882632.61486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882632.61525: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15794 1726882632.61529: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882632.61535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882632.61659: Set connection var ansible_connection to ssh
15794 1726882632.61716: Set connection var ansible_module_compression to ZIP_DEFLATED
15794 1726882632.61725: Set connection var ansible_pipelining to False
15794 1726882632.61728: Set connection var ansible_shell_executable to /bin/sh
15794 1726882632.61732: Set connection var ansible_shell_type to sh
15794 1726882632.61740: Set connection var ansible_timeout to 10
15794 1726882632.61743: variable 'ansible_shell_executable' from source: unknown
15794 1726882632.61746: variable 'ansible_connection' from source: unknown
15794 1726882632.61751: variable 'ansible_module_compression' from source: unknown
15794 1726882632.61754: variable 'ansible_shell_type' from source: unknown
15794 1726882632.61759: variable 'ansible_shell_executable' from source: unknown
15794 1726882632.61825: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882632.61830: variable 'ansible_pipelining' from source: unknown
15794 1726882632.61833: variable 'ansible_timeout' from source: unknown
15794 1726882632.61835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882632.62006: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
15794 1726882632.62041: variable 'omit' from source: magic vars
15794 1726882632.62044: starting attempt loop
15794 1726882632.62047: running the handler
15794 1726882632.62050: _low_level_execute_command(): starting
15794 1726882632.62052: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15794 1726882632.62957: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
15794 1726882632.62960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15794 1726882632.62963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15794 1726882632.62967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
15794 1726882632.62988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15794 1726882632.63090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15794 1726882632.64829: stdout chunk (state=3): >>>/root <<<
15794 1726882632.65026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15794 1726882632.65030: stdout chunk (state=3): >>><<<
15794 1726882632.65032: stderr chunk (state=3): >>><<<
15794 1726882632.65053: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15794 1726882632.65073: _low_level_execute_command(): starting
15794 1726882632.65084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452 `" && echo ansible-tmp-1726882632.650606-16897-131939506307452="` echo /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452 `" ) && sleep 0'
15794 1726882632.65833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882632.65932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15794 1726882632.65977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15794 1726882632.66004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15794 1726882632.66091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15794 1726882632.68102: stdout chunk (state=3): >>>ansible-tmp-1726882632.650606-16897-131939506307452=/root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452 <<<
15794 1726882632.68286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15794 1726882632.68640: stderr chunk (state=3): >>><<<
15794 1726882632.68644: stdout chunk (state=3): >>><<<
15794 1726882632.68647: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882632.650606-16897-131939506307452=/root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15794 1726882632.68649: variable 'ansible_module_compression' from source: unknown
15794 1726882632.68652: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED
15794 1726882632.68654: variable 'ansible_facts' from source: unknown
15794 1726882632.68899: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/AnsiballZ_service_facts.py
15794 1726882632.69388: Sending initial data
15794 1726882632.69392: Sent initial data (161 bytes)
15794 1726882632.70384: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
15794 1726882632.70398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882632.70524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<<
15794 1726882632.70542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882632.70610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15794 1726882632.70746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15794 1726882632.70832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15794 1726882632.72458: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
15794 1726882632.72508: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
15794 1726882632.72560: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpwth5ys1m /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/AnsiballZ_service_facts.py <<<
15794 1726882632.72590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/AnsiballZ_service_facts.py" <<<
15794 1726882632.72687: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpwth5ys1m" to remote "/root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/AnsiballZ_service_facts.py" <<<
15794 1726882632.74142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15794 1726882632.74201: stderr chunk (state=3): >>><<<
15794 1726882632.74214: stdout chunk (state=3): >>><<<
15794 1726882632.74256: done transferring module to remote
15794 1726882632.74276: _low_level_execute_command(): starting
15794 1726882632.74288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/ /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/AnsiballZ_service_facts.py && sleep 0'
15794 1726882632.75377: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
15794 1726882632.75393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15794 1726882632.75419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15794 1726882632.75449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15794 1726882632.75484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882632.75533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882632.75608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15794 1726882632.75649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15794 1726882632.75675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15794 1726882632.75788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15794 1726882632.77658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15794 1726882632.77677: stdout chunk (state=3): >>><<<
15794 1726882632.77694: stderr chunk (state=3): >>><<<
15794 1726882632.77716: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15794 1726882632.77726: _low_level_execute_command(): starting
15794 1726882632.77740: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/AnsiballZ_service_facts.py && sleep 0'
15794 1726882632.78356: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
15794 1726882632.78434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15794 1726882632.78441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15794 1726882632.78446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15794 1726882632.78449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 <<<
15794 1726882632.78451: stderr chunk (state=3): >>>debug2: match not found <<<
15794 1726882632.78454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882632.78456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
15794 1726882632.78459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<<
15794 1726882632.78461: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
15794 1726882632.78470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15794 1726882632.78476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15794 1726882632.78495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15794 1726882632.78504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<<
15794 1726882632.78513: stderr chunk (state=3): >>>debug2: match found <<<
15794 1726882632.78525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15794 1726882632.78629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
15794 1726882632.78633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15794 1726882632.78706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15794 1726882634.67291: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"},
"display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": 
"sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "sta<<< 15794 1726882634.67306: stdout chunk (state=3): >>>te": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zr<<< 15794 1726882634.67329: stdout chunk (state=3): >>>am0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", 
"state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", 
"status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15794 1726882634.68956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882634.68960: stdout chunk (state=3): >>><<< 15794 1726882634.68962: stderr chunk (state=3): >>><<< 15794 1726882634.69247: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", 
"status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": 
"mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882634.71388: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882634.71540: _low_level_execute_command(): starting 15794 1726882634.71555: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882632.650606-16897-131939506307452/ > /dev/null 2>&1 && sleep 0' 15794 1726882634.73002: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882634.73016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882634.73028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882634.73096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882634.73109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882634.73136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882634.73260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882634.75245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882634.75365: stderr chunk (state=3): >>><<< 15794 1726882634.75368: stdout chunk (state=3): >>><<< 15794 1726882634.75384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882634.75744: handler run complete 15794 1726882634.75949: variable 'ansible_facts' 
from source: unknown 15794 1726882634.76470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882634.78169: variable 'ansible_facts' from source: unknown 15794 1726882634.78620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882634.79576: attempt loop complete, returning result 15794 1726882634.79709: _execute() done 15794 1726882634.79844: dumping result to json 15794 1726882634.80087: done dumping result, returning 15794 1726882634.80152: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-94e5-e48f-000000000371] 15794 1726882634.80181: sending task result for task 0affe814-3a2d-94e5-e48f-000000000371 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882634.83795: no more pending results, returning what we have 15794 1726882634.83798: results queue empty 15794 1726882634.83800: checking for any_errors_fatal 15794 1726882634.83805: done checking for any_errors_fatal 15794 1726882634.83806: checking for max_fail_percentage 15794 1726882634.83809: done checking for max_fail_percentage 15794 1726882634.83810: checking to see if all hosts have failed and the running result is not ok 15794 1726882634.83811: done checking to see if all hosts have failed 15794 1726882634.83812: getting the remaining hosts for this loop 15794 1726882634.83814: done getting the remaining hosts for this loop 15794 1726882634.83818: getting the next task for host managed_node1 15794 1726882634.83824: done getting next task for host managed_node1 15794 1726882634.83828: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15794 1726882634.83831: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882634.83844: getting variables 15794 1726882634.83846: in VariableManager get_vars() 15794 1726882634.83885: Calling all_inventory to load vars for managed_node1 15794 1726882634.83889: Calling groups_inventory to load vars for managed_node1 15794 1726882634.83892: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882634.83903: Calling all_plugins_play to load vars for managed_node1 15794 1726882634.83907: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882634.83911: Calling groups_plugins_play to load vars for managed_node1 15794 1726882634.85040: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000371 15794 1726882634.85044: WORKER PROCESS EXITING 15794 1726882634.90869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882634.99745: done with get_vars() 15794 1726882634.99787: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:15 -0400 (0:00:02.408) 0:00:32.560 ****** 15794 1726882635.00264: entering _queue_task() for managed_node1/package_facts 15794 1726882635.00933: worker is 1 (out of 1 available) 15794 1726882635.00947: exiting _queue_task() for managed_node1/package_facts 15794 1726882635.00958: done queuing things up, now waiting for 
results queue to drain 15794 1726882635.00959: waiting for pending results... 15794 1726882635.01175: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15794 1726882635.01348: in run() - task 0affe814-3a2d-94e5-e48f-000000000372 15794 1726882635.01373: variable 'ansible_search_path' from source: unknown 15794 1726882635.01386: variable 'ansible_search_path' from source: unknown 15794 1726882635.01440: calling self._execute() 15794 1726882635.01551: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882635.01567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882635.01588: variable 'omit' from source: magic vars 15794 1726882635.02049: variable 'ansible_distribution_major_version' from source: facts 15794 1726882635.02072: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882635.02088: variable 'omit' from source: magic vars 15794 1726882635.02173: variable 'omit' from source: magic vars 15794 1726882635.02224: variable 'omit' from source: magic vars 15794 1726882635.02284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882635.02330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882635.02362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882635.02498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882635.02538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882635.02583: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882635.02642: variable 'ansible_host' from source: host vars for 
'managed_node1' 15794 1726882635.02676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882635.03033: Set connection var ansible_connection to ssh 15794 1726882635.03141: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882635.03145: Set connection var ansible_pipelining to False 15794 1726882635.03148: Set connection var ansible_shell_executable to /bin/sh 15794 1726882635.03151: Set connection var ansible_shell_type to sh 15794 1726882635.03153: Set connection var ansible_timeout to 10 15794 1726882635.03156: variable 'ansible_shell_executable' from source: unknown 15794 1726882635.03157: variable 'ansible_connection' from source: unknown 15794 1726882635.03160: variable 'ansible_module_compression' from source: unknown 15794 1726882635.03162: variable 'ansible_shell_type' from source: unknown 15794 1726882635.03164: variable 'ansible_shell_executable' from source: unknown 15794 1726882635.03166: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882635.03339: variable 'ansible_pipelining' from source: unknown 15794 1726882635.03342: variable 'ansible_timeout' from source: unknown 15794 1726882635.03344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882635.03841: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882635.03846: variable 'omit' from source: magic vars 15794 1726882635.03849: starting attempt loop 15794 1726882635.03852: running the handler 15794 1726882635.03854: _low_level_execute_command(): starting 15794 1726882635.03857: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882635.05217: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 
1726882635.05486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882635.05618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882635.05715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882635.07485: stdout chunk (state=3): >>>/root <<< 15794 1726882635.07652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882635.07767: stderr chunk (state=3): >>><<< 15794 1726882635.07782: stdout chunk (state=3): >>><<< 15794 1726882635.07810: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882635.07832: _low_level_execute_command(): starting 15794 1726882635.07908: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666 `" && echo ansible-tmp-1726882635.078167-17001-74774362234666="` echo /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666 `" ) && sleep 0' 15794 1726882635.09209: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882635.09284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882635.09445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882635.09531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882635.11546: stdout chunk (state=3): >>>ansible-tmp-1726882635.078167-17001-74774362234666=/root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666 <<< 15794 1726882635.11814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882635.11819: stdout chunk (state=3): >>><<< 15794 1726882635.11832: stderr chunk (state=3): >>><<< 15794 1726882635.11854: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882635.078167-17001-74774362234666=/root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882635.12033: variable 'ansible_module_compression' from source: unknown 15794 1726882635.12081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15794 1726882635.12268: variable 'ansible_facts' from source: unknown 15794 1726882635.12521: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/AnsiballZ_package_facts.py 15794 1726882635.13041: Sending initial data 15794 1726882635.13044: Sent initial data (160 bytes) 15794 1726882635.14153: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882635.14249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882635.14263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882635.14331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882635.14476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 15794 1726882635.14551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882635.14625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882635.16294: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882635.16417: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882635.16472: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp9fnv3d1n /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/AnsiballZ_package_facts.py <<< 15794 1726882635.16496: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/AnsiballZ_package_facts.py" <<< 15794 1726882635.16648: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp9fnv3d1n" to remote "/root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/AnsiballZ_package_facts.py" <<< 15794 1726882635.21589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882635.21593: stdout chunk (state=3): >>><<< 15794 1726882635.21595: stderr chunk (state=3): >>><<< 15794 1726882635.21957: done transferring module to remote 15794 1726882635.21961: _low_level_execute_command(): starting 15794 1726882635.21964: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/ /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/AnsiballZ_package_facts.py && sleep 0' 15794 1726882635.23129: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882635.23183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882635.23208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882635.23232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882635.23327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882635.25405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882635.25409: stdout chunk (state=3): >>><<< 15794 1726882635.25411: stderr chunk (state=3): >>><<< 15794 1726882635.25425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882635.25442: _low_level_execute_command(): starting 15794 1726882635.25453: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/AnsiballZ_package_facts.py && sleep 0' 15794 1726882635.26851: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882635.26878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882635.27104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882635.27124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882635.27224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882635.90612: stdout chunk (state=3): >>> {"ansible_facts": {"packages": 
{"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": 
"11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": 
"intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": 
"publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", 
"version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": 
[{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": 
"device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": 
"24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", 
"version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 15794 1726882635.90834: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": 
"zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": 
"hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": 
"rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, 
"arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", 
"release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": 
"2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 15794 1726882635.90851: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", 
"version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", 
"version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", 
"version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", 
"version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15794 1726882635.92868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882635.92871: stdout chunk (state=3): >>><<< 15794 1726882635.92874: stderr chunk (state=3): >>><<< 15794 1726882635.92892: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": 
"zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": 
"1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": 
"xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882635.98645: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882635.98669: _low_level_execute_command(): starting 15794 1726882635.98726: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882635.078167-17001-74774362234666/ > /dev/null 2>&1 && sleep 0' 15794 1726882635.99394: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882635.99427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882635.99512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882635.99629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882635.99651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882635.99744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882636.01738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882636.01751: stdout chunk (state=3): >>><<< 15794 1726882636.01841: stderr chunk (state=3): >>><<< 15794 1726882636.01849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882636.01853: handler run complete 15794 1726882636.03647: variable 'ansible_facts' from source: unknown 15794 1726882636.05841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.09801: variable 'ansible_facts' from source: unknown 15794 1726882636.11119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.14009: attempt loop complete, returning result 15794 1726882636.14030: _execute() done 15794 1726882636.14041: dumping result to json 15794 1726882636.14414: done dumping result, returning 15794 1726882636.14426: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-94e5-e48f-000000000372] 15794 1726882636.14432: sending task result for task 0affe814-3a2d-94e5-e48f-000000000372 15794 1726882636.18720: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000372 15794 1726882636.18724: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882636.18909: no more pending results, returning what we have 15794 1726882636.18913: results queue empty 15794 1726882636.18914: checking for any_errors_fatal 15794 1726882636.18920: done checking for any_errors_fatal 15794 1726882636.18921: checking for max_fail_percentage 15794 1726882636.18923: 
done checking for max_fail_percentage 15794 1726882636.18924: checking to see if all hosts have failed and the running result is not ok 15794 1726882636.18925: done checking to see if all hosts have failed 15794 1726882636.18926: getting the remaining hosts for this loop 15794 1726882636.18927: done getting the remaining hosts for this loop 15794 1726882636.18931: getting the next task for host managed_node1 15794 1726882636.19044: done getting next task for host managed_node1 15794 1726882636.19050: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15794 1726882636.19052: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882636.19081: getting variables 15794 1726882636.19084: in VariableManager get_vars() 15794 1726882636.19119: Calling all_inventory to load vars for managed_node1 15794 1726882636.19122: Calling groups_inventory to load vars for managed_node1 15794 1726882636.19125: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882636.19137: Calling all_plugins_play to load vars for managed_node1 15794 1726882636.19141: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882636.19145: Calling groups_plugins_play to load vars for managed_node1 15794 1726882636.22586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.26043: done with get_vars() 15794 1726882636.26079: done getting variables 15794 1726882636.26130: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:16 -0400 (0:00:01.258) 0:00:33.819 ****** 15794 1726882636.26164: entering _queue_task() for managed_node1/debug 15794 1726882636.26430: worker is 1 (out of 1 available) 15794 1726882636.26446: exiting _queue_task() for managed_node1/debug 15794 1726882636.26459: done queuing things up, now waiting for results queue to drain 15794 1726882636.26461: waiting for pending results... 15794 1726882636.26664: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15794 1726882636.26750: in run() - task 0affe814-3a2d-94e5-e48f-00000000003d 15794 1726882636.26763: variable 'ansible_search_path' from source: unknown 15794 1726882636.26768: variable 'ansible_search_path' from source: unknown 15794 1726882636.26806: calling self._execute() 15794 1726882636.26883: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.26893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.26909: variable 'omit' from source: magic vars 15794 1726882636.27246: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.27257: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882636.27264: variable 'omit' from source: magic vars 15794 1726882636.27299: variable 'omit' from source: magic vars 15794 1726882636.27389: variable 'network_provider' from source: set_fact 15794 1726882636.27403: variable 'omit' from source: magic vars 15794 1726882636.27440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882636.27473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 15794 1726882636.27495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882636.27511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882636.27521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882636.27552: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882636.27558: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.27561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.27646: Set connection var ansible_connection to ssh 15794 1726882636.27654: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882636.27662: Set connection var ansible_pipelining to False 15794 1726882636.27669: Set connection var ansible_shell_executable to /bin/sh 15794 1726882636.27674: Set connection var ansible_shell_type to sh 15794 1726882636.27690: Set connection var ansible_timeout to 10 15794 1726882636.27712: variable 'ansible_shell_executable' from source: unknown 15794 1726882636.27717: variable 'ansible_connection' from source: unknown 15794 1726882636.27720: variable 'ansible_module_compression' from source: unknown 15794 1726882636.27722: variable 'ansible_shell_type' from source: unknown 15794 1726882636.27727: variable 'ansible_shell_executable' from source: unknown 15794 1726882636.27729: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.27736: variable 'ansible_pipelining' from source: unknown 15794 1726882636.27739: variable 'ansible_timeout' from source: unknown 15794 1726882636.27745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.27868: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882636.27879: variable 'omit' from source: magic vars 15794 1726882636.27888: starting attempt loop 15794 1726882636.27891: running the handler 15794 1726882636.27936: handler run complete 15794 1726882636.27949: attempt loop complete, returning result 15794 1726882636.27952: _execute() done 15794 1726882636.27955: dumping result to json 15794 1726882636.27960: done dumping result, returning 15794 1726882636.27968: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-94e5-e48f-00000000003d] 15794 1726882636.27974: sending task result for task 0affe814-3a2d-94e5-e48f-00000000003d 15794 1726882636.28063: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000003d 15794 1726882636.28066: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 15794 1726882636.28131: no more pending results, returning what we have 15794 1726882636.28138: results queue empty 15794 1726882636.28139: checking for any_errors_fatal 15794 1726882636.28151: done checking for any_errors_fatal 15794 1726882636.28152: checking for max_fail_percentage 15794 1726882636.28153: done checking for max_fail_percentage 15794 1726882636.28155: checking to see if all hosts have failed and the running result is not ok 15794 1726882636.28155: done checking to see if all hosts have failed 15794 1726882636.28156: getting the remaining hosts for this loop 15794 1726882636.28159: done getting the remaining hosts for this loop 15794 1726882636.28163: getting the next task for host managed_node1 15794 1726882636.28169: done getting next task for host managed_node1 15794 1726882636.28173: ^ task is: TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15794 1726882636.28177: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882636.28188: getting variables 15794 1726882636.28190: in VariableManager get_vars() 15794 1726882636.28223: Calling all_inventory to load vars for managed_node1 15794 1726882636.28227: Calling groups_inventory to load vars for managed_node1 15794 1726882636.28230: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882636.28249: Calling all_plugins_play to load vars for managed_node1 15794 1726882636.28253: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882636.28257: Calling groups_plugins_play to load vars for managed_node1 15794 1726882636.30436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.32560: done with get_vars() 15794 1726882636.32583: done getting variables 15794 1726882636.32644: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:16 -0400 (0:00:00.065) 0:00:33.884 ****** 15794 1726882636.32680: entering _queue_task() 
for managed_node1/fail 15794 1726882636.32978: worker is 1 (out of 1 available) 15794 1726882636.32993: exiting _queue_task() for managed_node1/fail 15794 1726882636.33007: done queuing things up, now waiting for results queue to drain 15794 1726882636.33008: waiting for pending results... 15794 1726882636.33267: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15794 1726882636.33414: in run() - task 0affe814-3a2d-94e5-e48f-00000000003e 15794 1726882636.33419: variable 'ansible_search_path' from source: unknown 15794 1726882636.33422: variable 'ansible_search_path' from source: unknown 15794 1726882636.33482: calling self._execute() 15794 1726882636.33557: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.33569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.33572: variable 'omit' from source: magic vars 15794 1726882636.33964: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.33974: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882636.34082: variable 'network_state' from source: role '' defaults 15794 1726882636.34095: Evaluated conditional (network_state != {}): False 15794 1726882636.34099: when evaluation is False, skipping this task 15794 1726882636.34102: _execute() done 15794 1726882636.34107: dumping result to json 15794 1726882636.34109: done dumping result, returning 15794 1726882636.34125: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-94e5-e48f-00000000003e] 15794 1726882636.34138: sending task result for task 0affe814-3a2d-94e5-e48f-00000000003e skipping: [managed_node1] => { "changed": 
false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882636.34359: no more pending results, returning what we have 15794 1726882636.34362: results queue empty 15794 1726882636.34364: checking for any_errors_fatal 15794 1726882636.34371: done checking for any_errors_fatal 15794 1726882636.34372: checking for max_fail_percentage 15794 1726882636.34374: done checking for max_fail_percentage 15794 1726882636.34375: checking to see if all hosts have failed and the running result is not ok 15794 1726882636.34376: done checking to see if all hosts have failed 15794 1726882636.34377: getting the remaining hosts for this loop 15794 1726882636.34378: done getting the remaining hosts for this loop 15794 1726882636.34382: getting the next task for host managed_node1 15794 1726882636.34390: done getting next task for host managed_node1 15794 1726882636.34395: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15794 1726882636.34397: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882636.34412: getting variables 15794 1726882636.34413: in VariableManager get_vars() 15794 1726882636.34467: Calling all_inventory to load vars for managed_node1 15794 1726882636.34470: Calling groups_inventory to load vars for managed_node1 15794 1726882636.34473: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882636.34482: Calling all_plugins_play to load vars for managed_node1 15794 1726882636.34484: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882636.34491: Calling groups_plugins_play to load vars for managed_node1 15794 1726882636.35533: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000003e 15794 1726882636.35538: WORKER PROCESS EXITING 15794 1726882636.36079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.38450: done with get_vars() 15794 1726882636.38481: done getting variables 15794 1726882636.38529: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:16 -0400 (0:00:00.058) 0:00:33.943 ****** 15794 1726882636.38557: entering _queue_task() for managed_node1/fail 15794 1726882636.38822: worker is 1 (out of 1 available) 15794 1726882636.38837: exiting _queue_task() for managed_node1/fail 15794 1726882636.38849: done queuing things up, now waiting for results queue to drain 15794 1726882636.38851: waiting for pending results... 
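The skips recorded above (`"false_condition": "network_state != {}", "skip_reason": "Conditional result was False"`) come from role tasks guarded by `when:` conditions: because the play never sets `network_state`, the role default `{}` makes each guard false and the `fail` action is never executed. A minimal sketch of that pattern follows; the task wording is taken from the log, but the body is a hypothetical reconstruction, not the actual source of `roles/network/tasks/main.yml` in `fedora.linux_system_roles`:

```yaml
# Hypothetical sketch of the guarded abort tasks seen in this log.
# The real role tasks may differ; only the names and the
# network_state != {} guard are attested by the log output above.
- name: Abort applying the network state configuration if using the
    network_state variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported with
      the initscripts provider
  when:
    - network_state != {}                     # role default is {}, so this skips
    - network_provider == "initscripts"       # log shows provider is nm

- name: Abort applying the network state configuration if the system
    version of the managed host is below 8
  fail:
    msg: Applying the network state configuration requires EL8 or later
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8
```

When every `when:` entry is true, the `fail` action aborts the play for that host; otherwise Ansible emits exactly the `skipping: [managed_node1]` result shown in the log, reporting the first condition that evaluated false.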
15794 1726882636.39039: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15794 1726882636.39119: in run() - task 0affe814-3a2d-94e5-e48f-00000000003f 15794 1726882636.39132: variable 'ansible_search_path' from source: unknown 15794 1726882636.39137: variable 'ansible_search_path' from source: unknown 15794 1726882636.39169: calling self._execute() 15794 1726882636.39251: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.39257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.39267: variable 'omit' from source: magic vars 15794 1726882636.39601: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.39610: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882636.39719: variable 'network_state' from source: role '' defaults 15794 1726882636.39734: Evaluated conditional (network_state != {}): False 15794 1726882636.39741: when evaluation is False, skipping this task 15794 1726882636.39744: _execute() done 15794 1726882636.39747: dumping result to json 15794 1726882636.39750: done dumping result, returning 15794 1726882636.39753: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-94e5-e48f-00000000003f] 15794 1726882636.39762: sending task result for task 0affe814-3a2d-94e5-e48f-00000000003f 15794 1726882636.39856: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000003f 15794 1726882636.39859: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882636.39918: no more pending results, returning what we have 15794 
1726882636.39922: results queue empty 15794 1726882636.39923: checking for any_errors_fatal 15794 1726882636.39929: done checking for any_errors_fatal 15794 1726882636.39930: checking for max_fail_percentage 15794 1726882636.39932: done checking for max_fail_percentage 15794 1726882636.39933: checking to see if all hosts have failed and the running result is not ok 15794 1726882636.39936: done checking to see if all hosts have failed 15794 1726882636.39937: getting the remaining hosts for this loop 15794 1726882636.39939: done getting the remaining hosts for this loop 15794 1726882636.39943: getting the next task for host managed_node1 15794 1726882636.39948: done getting next task for host managed_node1 15794 1726882636.39952: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15794 1726882636.39954: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882636.39968: getting variables 15794 1726882636.39970: in VariableManager get_vars() 15794 1726882636.40005: Calling all_inventory to load vars for managed_node1 15794 1726882636.40008: Calling groups_inventory to load vars for managed_node1 15794 1726882636.40010: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882636.40020: Calling all_plugins_play to load vars for managed_node1 15794 1726882636.40023: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882636.40027: Calling groups_plugins_play to load vars for managed_node1 15794 1726882636.45939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.48059: done with get_vars() 15794 1726882636.48098: done getting variables 15794 1726882636.48156: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:16 -0400 (0:00:00.096) 0:00:34.039 ****** 15794 1726882636.48185: entering _queue_task() for managed_node1/fail 15794 1726882636.48548: worker is 1 (out of 1 available) 15794 1726882636.48563: exiting _queue_task() for managed_node1/fail 15794 1726882636.48581: done queuing things up, now waiting for results queue to drain 15794 1726882636.48583: waiting for pending results... 
15794 1726882636.48846: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15794 1726882636.48926: in run() - task 0affe814-3a2d-94e5-e48f-000000000040 15794 1726882636.48940: variable 'ansible_search_path' from source: unknown 15794 1726882636.48944: variable 'ansible_search_path' from source: unknown 15794 1726882636.48982: calling self._execute() 15794 1726882636.49059: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.49067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.49082: variable 'omit' from source: magic vars 15794 1726882636.49474: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.49480: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882636.49649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882636.51598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882636.51662: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882636.51708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882636.51738: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882636.51764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882636.51835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882636.51860: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882636.51887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882636.51925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882636.51956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882636.52066: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.52076: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15794 1726882636.52210: variable 'ansible_distribution' from source: facts 15794 1726882636.52214: variable '__network_rh_distros' from source: role '' defaults 15794 1726882636.52217: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15794 1726882636.52220: when evaluation is False, skipping this task 15794 1726882636.52222: _execute() done 15794 1726882636.52225: dumping result to json 15794 1726882636.52227: done dumping result, returning 15794 1726882636.52247: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-94e5-e48f-000000000040] 15794 1726882636.52250: sending task result for task 0affe814-3a2d-94e5-e48f-000000000040 15794 1726882636.52354: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000040 15794 1726882636.52356: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 15794 1726882636.52415: no more pending results, returning what we have 15794 1726882636.52419: results queue empty 15794 1726882636.52425: checking for any_errors_fatal 15794 1726882636.52463: done checking for any_errors_fatal 15794 1726882636.52465: checking for max_fail_percentage 15794 1726882636.52497: done checking for max_fail_percentage 15794 1726882636.52499: checking to see if all hosts have failed and the running result is not ok 15794 1726882636.52500: done checking to see if all hosts have failed 15794 1726882636.52500: getting the remaining hosts for this loop 15794 1726882636.52502: done getting the remaining hosts for this loop 15794 1726882636.52506: getting the next task for host managed_node1 15794 1726882636.52512: done getting next task for host managed_node1 15794 1726882636.52516: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15794 1726882636.52518: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882636.52566: getting variables 15794 1726882636.52568: in VariableManager get_vars() 15794 1726882636.52606: Calling all_inventory to load vars for managed_node1 15794 1726882636.52609: Calling groups_inventory to load vars for managed_node1 15794 1726882636.52611: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882636.52621: Calling all_plugins_play to load vars for managed_node1 15794 1726882636.52624: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882636.52627: Calling groups_plugins_play to load vars for managed_node1 15794 1726882636.54201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.56168: done with get_vars() 15794 1726882636.56191: done getting variables 15794 1726882636.56266: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:16 -0400 (0:00:00.081) 0:00:34.120 ****** 15794 1726882636.56301: entering _queue_task() for managed_node1/dnf 15794 1726882636.56564: worker is 1 (out of 1 available) 15794 1726882636.56582: exiting _queue_task() for managed_node1/dnf 15794 1726882636.56596: done queuing things up, now waiting for results queue to drain 15794 1726882636.56598: waiting for pending results... 
15794 1726882636.56831: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15794 1726882636.56912: in run() - task 0affe814-3a2d-94e5-e48f-000000000041 15794 1726882636.56929: variable 'ansible_search_path' from source: unknown 15794 1726882636.56933: variable 'ansible_search_path' from source: unknown 15794 1726882636.56981: calling self._execute() 15794 1726882636.57083: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.57092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.57112: variable 'omit' from source: magic vars 15794 1726882636.57502: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.57514: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882636.57725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882636.59797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882636.59857: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882636.59907: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882636.59947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882636.59976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882636.60038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882636.60066: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882636.60105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882636.60167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882636.60173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882636.60309: variable 'ansible_distribution' from source: facts 15794 1726882636.60313: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.60316: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15794 1726882636.60424: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882636.60606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882636.60669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882636.60673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882636.60719: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882636.60735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882636.60774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882636.60800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882636.60829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882636.60867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882636.60881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882636.60916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882636.60949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 
1726882636.61020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882636.61043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882636.61054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882636.61189: variable 'network_connections' from source: play vars 15794 1726882636.61198: variable 'profile' from source: play vars 15794 1726882636.61285: variable 'profile' from source: play vars 15794 1726882636.61295: variable 'interface' from source: set_fact 15794 1726882636.61360: variable 'interface' from source: set_fact 15794 1726882636.61438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882636.61598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882636.61632: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882636.61663: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882636.61708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882636.61744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882636.61763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882636.61792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882636.61815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882636.61888: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882636.62111: variable 'network_connections' from source: play vars 15794 1726882636.62115: variable 'profile' from source: play vars 15794 1726882636.62194: variable 'profile' from source: play vars 15794 1726882636.62197: variable 'interface' from source: set_fact 15794 1726882636.62271: variable 'interface' from source: set_fact 15794 1726882636.62275: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15794 1726882636.62281: when evaluation is False, skipping this task 15794 1726882636.62284: _execute() done 15794 1726882636.62286: dumping result to json 15794 1726882636.62288: done dumping result, returning 15794 1726882636.62302: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000041] 15794 1726882636.62314: sending task result for task 0affe814-3a2d-94e5-e48f-000000000041 15794 1726882636.62421: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000041 15794 1726882636.62424: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15794 1726882636.62484: no more pending results, returning what we have 15794 1726882636.62488: results queue empty 15794 1726882636.62489: checking for any_errors_fatal 15794 1726882636.62495: done checking for any_errors_fatal 15794 1726882636.62496: checking for max_fail_percentage 15794 1726882636.62497: done checking for max_fail_percentage 15794 1726882636.62498: checking to see if all hosts have failed and the running result is not ok 15794 1726882636.62499: done checking to see if all hosts have failed 15794 1726882636.62500: getting the remaining hosts for this loop 15794 1726882636.62503: done getting the remaining hosts for this loop 15794 1726882636.62544: getting the next task for host managed_node1 15794 1726882636.62550: done getting next task for host managed_node1 15794 1726882636.62555: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15794 1726882636.62557: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882636.62571: getting variables 15794 1726882636.62573: in VariableManager get_vars() 15794 1726882636.62609: Calling all_inventory to load vars for managed_node1 15794 1726882636.62612: Calling groups_inventory to load vars for managed_node1 15794 1726882636.62615: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882636.62634: Calling all_plugins_play to load vars for managed_node1 15794 1726882636.62639: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882636.62644: Calling groups_plugins_play to load vars for managed_node1 15794 1726882636.64114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.66005: done with get_vars() 15794 1726882636.66043: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15794 1726882636.66113: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:16 -0400 (0:00:00.098) 0:00:34.219 ****** 15794 1726882636.66140: entering _queue_task() for managed_node1/yum 15794 1726882636.66375: worker is 1 (out of 1 available) 15794 1726882636.66389: exiting _queue_task() for managed_node1/yum 15794 1726882636.66403: done queuing things up, now waiting for results queue to drain 15794 1726882636.66405: waiting for pending results... 
15794 1726882636.66632: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15794 1726882636.66704: in run() - task 0affe814-3a2d-94e5-e48f-000000000042 15794 1726882636.66716: variable 'ansible_search_path' from source: unknown 15794 1726882636.66719: variable 'ansible_search_path' from source: unknown 15794 1726882636.66754: calling self._execute() 15794 1726882636.66830: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882636.66838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882636.66847: variable 'omit' from source: magic vars 15794 1726882636.67166: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.67176: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882636.67339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882636.69732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882636.69805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882636.69835: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882636.69870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882636.69893: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882636.70005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882636.70023: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882636.70058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882636.70107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882636.70122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882636.70224: variable 'ansible_distribution_major_version' from source: facts 15794 1726882636.70237: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15794 1726882636.70241: when evaluation is False, skipping this task 15794 1726882636.70244: _execute() done 15794 1726882636.70247: dumping result to json 15794 1726882636.70252: done dumping result, returning 15794 1726882636.70259: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000042] 15794 1726882636.70269: sending task result for task 0affe814-3a2d-94e5-e48f-000000000042 15794 1726882636.70368: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000042 15794 1726882636.70373: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15794 1726882636.70450: no more pending results, returning 
what we have 15794 1726882636.70454: results queue empty 15794 1726882636.70455: checking for any_errors_fatal 15794 1726882636.70461: done checking for any_errors_fatal 15794 1726882636.70462: checking for max_fail_percentage 15794 1726882636.70464: done checking for max_fail_percentage 15794 1726882636.70465: checking to see if all hosts have failed and the running result is not ok 15794 1726882636.70466: done checking to see if all hosts have failed 15794 1726882636.70467: getting the remaining hosts for this loop 15794 1726882636.70468: done getting the remaining hosts for this loop 15794 1726882636.70472: getting the next task for host managed_node1 15794 1726882636.70478: done getting next task for host managed_node1 15794 1726882636.70484: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15794 1726882636.70486: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882636.70501: getting variables 15794 1726882636.70503: in VariableManager get_vars() 15794 1726882636.70576: Calling all_inventory to load vars for managed_node1 15794 1726882636.70582: Calling groups_inventory to load vars for managed_node1 15794 1726882636.70584: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882636.70591: Calling all_plugins_play to load vars for managed_node1 15794 1726882636.70594: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882636.70596: Calling groups_plugins_play to load vars for managed_node1 15794 1726882636.72582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882636.75908: done with get_vars() 15794 1726882636.75972: done getting variables 15794 1726882636.76041: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:16 -0400 (0:00:00.099) 0:00:34.318 ****** 15794 1726882636.76074: entering _queue_task() for managed_node1/fail 15794 1726882636.76642: worker is 1 (out of 1 available) 15794 1726882636.76653: exiting _queue_task() for managed_node1/fail 15794 1726882636.76664: done queuing things up, now waiting for results queue to drain 15794 1726882636.76665: waiting for pending results... 
15794 1726882636.76870: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
15794 1726882636.76939: in run() - task 0affe814-3a2d-94e5-e48f-000000000043
15794 1726882636.76944: variable 'ansible_search_path' from source: unknown
15794 1726882636.76947: variable 'ansible_search_path' from source: unknown
15794 1726882636.77041: calling self._execute()
15794 1726882636.77090: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882636.77098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882636.77114: variable 'omit' from source: magic vars
15794 1726882636.77561: variable 'ansible_distribution_major_version' from source: facts
15794 1726882636.77573: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882636.77734: variable '__network_wireless_connections_defined' from source: role '' defaults
15794 1726882636.78000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15794 1726882636.80756: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15794 1726882636.80867: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15794 1726882636.80877: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15794 1726882636.80928: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15794 1726882636.80961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15794 1726882636.81064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882636.81103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882636.81137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882636.81198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882636.81216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882636.81279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882636.81313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882636.81346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882636.81408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882636.81439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882636.81471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882636.81517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882636.81544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882636.81602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882636.81625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882636.81855: variable 'network_connections' from source: play vars
15794 1726882636.81939: variable 'profile' from source: play vars
15794 1726882636.81953: variable 'profile' from source: play vars
15794 1726882636.81960: variable 'interface' from source: set_fact
15794 1726882636.82038: variable 'interface' from source: set_fact
15794 1726882636.82127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15794 1726882636.82352: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15794 1726882636.82404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15794 1726882636.82443: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15794 1726882636.82480: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15794 1726882636.82536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15794 1726882636.82562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15794 1726882636.82606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882636.82711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15794 1726882636.82715: variable '__network_team_connections_defined' from source: role '' defaults
15794 1726882636.83035: variable 'network_connections' from source: play vars
15794 1726882636.83041: variable 'profile' from source: play vars
15794 1726882636.83117: variable 'profile' from source: play vars
15794 1726882636.83128: variable 'interface' from source: set_fact
15794 1726882636.83204: variable 'interface' from source: set_fact
15794 1726882636.83237: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15794 1726882636.83240: when evaluation is False, skipping this task
15794 1726882636.83247: _execute() done
15794 1726882636.83250: dumping result to json
15794 1726882636.83255: done dumping result, returning
15794 1726882636.83267: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000043]
15794 1726882636.83346: sending task result for task 0affe814-3a2d-94e5-e48f-000000000043
15794 1726882636.83418: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000043
15794 1726882636.83421: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15794 1726882636.83508: no more pending results, returning what we have
15794 1726882636.83512: results queue empty
15794 1726882636.83513: checking for any_errors_fatal
15794 1726882636.83520: done checking for any_errors_fatal
15794 1726882636.83521: checking for max_fail_percentage
15794 1726882636.83523: done checking for max_fail_percentage
15794 1726882636.83524: checking to see if all hosts have failed and the running result is not ok
15794 1726882636.83525: done checking to see if all hosts have failed
15794 1726882636.83526: getting the remaining hosts for this loop
15794 1726882636.83528: done getting the remaining hosts for this loop
15794 1726882636.83532: getting the next task for host managed_node1
15794 1726882636.83541: done getting next task for host managed_node1
15794 1726882636.83546: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
15794 1726882636.83549: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882636.83567: getting variables
15794 1726882636.83569: in VariableManager get_vars()
15794 1726882636.83615: Calling all_inventory to load vars for managed_node1
15794 1726882636.83618: Calling groups_inventory to load vars for managed_node1
15794 1726882636.83621: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882636.83633: Calling all_plugins_play to load vars for managed_node1
15794 1726882636.83639: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882636.83644: Calling groups_plugins_play to load vars for managed_node1
15794 1726882636.86854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882636.90841: done with get_vars()
15794 1726882636.90882: done getting variables
15794 1726882636.90971: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:37:16 -0400 (0:00:00.149) 0:00:34.468 ******
15794 1726882636.91018: entering _queue_task() for managed_node1/package
15794 1726882636.91430: worker is 1 (out of 1 available)
15794 1726882636.91452: exiting _queue_task() for managed_node1/package
15794 1726882636.91476: done queuing things up, now waiting for results queue to drain
15794 1726882636.91477: waiting for pending results...
15794 1726882636.91853: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
15794 1726882636.91873: in run() - task 0affe814-3a2d-94e5-e48f-000000000044
15794 1726882636.92040: variable 'ansible_search_path' from source: unknown
15794 1726882636.92044: variable 'ansible_search_path' from source: unknown
15794 1726882636.92047: calling self._execute()
15794 1726882636.92061: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882636.92076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882636.92098: variable 'omit' from source: magic vars
15794 1726882636.92551: variable 'ansible_distribution_major_version' from source: facts
15794 1726882636.92571: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882636.92830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15794 1726882636.93197: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15794 1726882636.93322: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15794 1726882636.93371: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15794 1726882636.93457: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15794 1726882636.94040: variable 'network_packages' from source: role '' defaults
15794 1726882636.94115: variable '__network_provider_setup' from source: role '' defaults
15794 1726882636.94255: variable '__network_service_name_default_nm' from source: role '' defaults
15794 1726882636.94349: variable '__network_service_name_default_nm' from source: role '' defaults
15794 1726882636.94450: variable '__network_packages_default_nm' from source: role '' defaults
15794 1726882636.94532: variable '__network_packages_default_nm' from source: role '' defaults
15794 1726882636.95008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15794 1726882637.00538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15794 1726882637.00629: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15794 1726882637.01040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15794 1726882637.01048: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15794 1726882637.01051: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15794 1726882637.01440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.01444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.01447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.01449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.01451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.01660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.01700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.01739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.01798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.01860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.02593: variable '__network_packages_default_gobject_packages' from source: role '' defaults
15794 1726882637.02737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.02972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.03010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.03066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.03092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.03455: variable 'ansible_python' from source: facts
15794 1726882637.03492: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
15794 1726882637.03602: variable '__network_wpa_supplicant_required' from source: role '' defaults
15794 1726882637.04040: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15794 1726882637.04113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.04277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.04317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.04499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.04522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.04590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.04772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.04811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.04867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.04961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.05360: variable 'network_connections' from source: play vars
15794 1726882637.05374: variable 'profile' from source: play vars
15794 1726882637.05701: variable 'profile' from source: play vars
15794 1726882637.05740: variable 'interface' from source: set_fact
15794 1726882637.05808: variable 'interface' from source: set_fact
15794 1726882637.05999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15794 1726882637.06040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15794 1726882637.06182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.06224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15794 1726882637.06286: variable '__network_wireless_connections_defined' from source: role '' defaults
15794 1726882637.07139: variable 'network_connections' from source: play vars
15794 1726882637.07152: variable 'profile' from source: play vars
15794 1726882637.07467: variable 'profile' from source: play vars
15794 1726882637.07484: variable 'interface' from source: set_fact
15794 1726882637.07939: variable 'interface' from source: set_fact
15794 1726882637.07943: variable '__network_packages_default_wireless' from source: role '' defaults
15794 1726882637.07945: variable '__network_wireless_connections_defined' from source: role '' defaults
15794 1726882637.09144: variable 'network_connections' from source: play vars
15794 1726882637.09148: variable 'profile' from source: play vars
15794 1726882637.09340: variable 'profile' from source: play vars
15794 1726882637.09344: variable 'interface' from source: set_fact
15794 1726882637.09744: variable 'interface' from source: set_fact
15794 1726882637.09748: variable '__network_packages_default_team' from source: role '' defaults
15794 1726882637.09750: variable '__network_team_connections_defined' from source: role '' defaults
15794 1726882637.10493: variable 'network_connections' from source: play vars
15794 1726882637.10504: variable 'profile' from source: play vars
15794 1726882637.10587: variable 'profile' from source: play vars
15794 1726882637.10848: variable 'interface' from source: set_fact
15794 1726882637.10968: variable 'interface' from source: set_fact
15794 1726882637.11042: variable '__network_service_name_default_initscripts' from source: role '' defaults
15794 1726882637.11316: variable '__network_service_name_default_initscripts' from source: role '' defaults
15794 1726882637.11332: variable '__network_packages_default_initscripts' from source: role '' defaults
15794 1726882637.11414: variable '__network_packages_default_initscripts' from source: role '' defaults
15794 1726882637.11904: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15794 1726882637.13337: variable 'network_connections' from source: play vars
15794 1726882637.13450: variable 'profile' from source: play vars
15794 1726882637.13538: variable 'profile' from source: play vars
15794 1726882637.13652: variable 'interface' from source: set_fact
15794 1726882637.13842: variable 'interface' from source: set_fact
15794 1726882637.13861: variable 'ansible_distribution' from source: facts
15794 1726882637.13873: variable '__network_rh_distros' from source: role '' defaults
15794 1726882637.13891: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.13916: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15794 1726882637.14461: variable 'ansible_distribution' from source: facts
15794 1726882637.14471: variable '__network_rh_distros' from source: role '' defaults
15794 1726882637.14485: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.14497: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
15794 1726882637.14821: variable 'ansible_distribution' from source: facts
15794 1726882637.15139: variable '__network_rh_distros' from source: role '' defaults
15794 1726882637.15142: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.15145: variable 'network_provider' from source: set_fact
15794 1726882637.15147: variable 'ansible_facts' from source: unknown
15794 1726882637.17867: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
15794 1726882637.18053: when evaluation is False, skipping this task
15794 1726882637.18069: _execute() done
15794 1726882637.18084: dumping result to json
15794 1726882637.18098: done dumping result, returning
15794 1726882637.18116: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-94e5-e48f-000000000044]
15794 1726882637.18137: sending task result for task 0affe814-3a2d-94e5-e48f-000000000044
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
15794 1726882637.18395: no more pending results, returning what we have
15794 1726882637.18400: results queue empty
15794 1726882637.18401: checking for any_errors_fatal
15794 1726882637.18489: done checking for any_errors_fatal
15794 1726882637.18490: checking for max_fail_percentage
15794 1726882637.18493: done checking for max_fail_percentage
15794 1726882637.18494: checking to see if all hosts have failed and the running result is not ok
15794 1726882637.18495: done checking to see if all hosts have failed
15794 1726882637.18496: getting the remaining hosts for this loop
15794 1726882637.18499: done getting the remaining hosts for this loop
15794 1726882637.18508: getting the next task for host managed_node1
15794 1726882637.18660: done getting next task for host managed_node1
15794 1726882637.18666: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15794 1726882637.18669: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882637.18742: getting variables
15794 1726882637.18746: in VariableManager get_vars()
15794 1726882637.18966: Calling all_inventory to load vars for managed_node1
15794 1726882637.18971: Calling groups_inventory to load vars for managed_node1
15794 1726882637.18975: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882637.18989: Calling all_plugins_play to load vars for managed_node1
15794 1726882637.18998: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882637.19003: Calling groups_plugins_play to load vars for managed_node1
15794 1726882637.19526: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000044
15794 1726882637.19535: WORKER PROCESS EXITING
15794 1726882637.26876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882637.35502: done with get_vars()
15794 1726882637.35662: done getting variables
15794 1726882637.35864: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:37:17 -0400 (0:00:00.448) 0:00:34.916 ******
15794 1726882637.35904: entering _queue_task() for managed_node1/package
15794 1726882637.36956: worker is 1 (out of 1 available)
15794 1726882637.36968: exiting _queue_task() for managed_node1/package
15794 1726882637.36983: done queuing things up, now waiting for results queue to drain
15794 1726882637.36984: waiting for pending results...
15794 1726882637.37497: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15794 1726882637.37751: in run() - task 0affe814-3a2d-94e5-e48f-000000000045
15794 1726882637.37768: variable 'ansible_search_path' from source: unknown
15794 1726882637.37773: variable 'ansible_search_path' from source: unknown
15794 1726882637.37940: calling self._execute()
15794 1726882637.38073: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882637.38081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882637.38098: variable 'omit' from source: magic vars
15794 1726882637.38789: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.38793: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882637.38970: variable 'network_state' from source: role '' defaults
15794 1726882637.39041: Evaluated conditional (network_state != {}): False
15794 1726882637.39045: when evaluation is False, skipping this task
15794 1726882637.39048: _execute() done
15794 1726882637.39117: dumping result to json
15794 1726882637.39121: done dumping result, returning
15794 1726882637.39130: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-94e5-e48f-000000000045]
15794 1726882637.39151: sending task result for task 0affe814-3a2d-94e5-e48f-000000000045
15794 1726882637.39271: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000045
15794 1726882637.39275: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15794 1726882637.39340: no more pending results, returning what we have
15794 1726882637.39345: results queue empty
15794 1726882637.39346: checking for any_errors_fatal
15794 1726882637.39354: done checking for any_errors_fatal
15794 1726882637.39355: checking for max_fail_percentage
15794 1726882637.39358: done checking for max_fail_percentage
15794 1726882637.39359: checking to see if all hosts have failed and the running result is not ok
15794 1726882637.39360: done checking to see if all hosts have failed
15794 1726882637.39361: getting the remaining hosts for this loop
15794 1726882637.39364: done getting the remaining hosts for this loop
15794 1726882637.39369: getting the next task for host managed_node1
15794 1726882637.39377: done getting next task for host managed_node1
15794 1726882637.39385: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15794 1726882637.39389: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882637.39408: getting variables
15794 1726882637.39410: in VariableManager get_vars()
15794 1726882637.39689: Calling all_inventory to load vars for managed_node1
15794 1726882637.39694: Calling groups_inventory to load vars for managed_node1
15794 1726882637.39697: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882637.39712: Calling all_plugins_play to load vars for managed_node1
15794 1726882637.39716: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882637.39942: Calling groups_plugins_play to load vars for managed_node1
15794 1726882637.44191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882637.48742: done with get_vars()
15794 1726882637.48897: done getting variables
15794 1726882637.48969: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:37:17 -0400 (0:00:00.131) 0:00:35.048 ******
15794 1726882637.49064: entering _queue_task() for managed_node1/package
15794 1726882637.49807: worker is 1 (out of 1 available)
15794 1726882637.49821: exiting _queue_task() for managed_node1/package
15794 1726882637.49836: done queuing things up, now waiting for results queue to drain
15794 1726882637.49967: waiting for pending results...
15794 1726882637.50554: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15794 1726882637.50761: in run() - task 0affe814-3a2d-94e5-e48f-000000000046
15794 1726882637.50764: variable 'ansible_search_path' from source: unknown
15794 1726882637.50767: variable 'ansible_search_path' from source: unknown
15794 1726882637.50770: calling self._execute()
15794 1726882637.50903: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882637.50987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882637.51005: variable 'omit' from source: magic vars
15794 1726882637.51900: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.51967: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882637.52384: variable 'network_state' from source: role '' defaults
15794 1726882637.52388: Evaluated conditional (network_state != {}): False
15794 1726882637.52391: when evaluation is False, skipping this task
15794 1726882637.52393: _execute() done
15794 1726882637.52395: dumping result to json
15794 1726882637.52398: done dumping result, returning
15794 1726882637.52401: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-94e5-e48f-000000000046]
15794 1726882637.52403: sending task result for task 0affe814-3a2d-94e5-e48f-000000000046
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15794 1726882637.52649: no more pending results, returning what we have
15794 1726882637.52654: results queue empty
15794 1726882637.52655: checking for any_errors_fatal
15794 1726882637.52665: done checking for any_errors_fatal
15794 1726882637.52666: checking for max_fail_percentage
15794 1726882637.52668: done checking for max_fail_percentage
15794 1726882637.52669: checking to see if all hosts have failed and the running result is not ok
15794 1726882637.52670: done checking to see if all hosts have failed
15794 1726882637.52671: getting the remaining hosts for this loop
15794 1726882637.52673: done getting the remaining hosts for this loop
15794 1726882637.52678: getting the next task for host managed_node1
15794 1726882637.52688: done getting next task for host managed_node1
15794 1726882637.52694: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15794 1726882637.52696: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882637.52714: getting variables
15794 1726882637.52716: in VariableManager get_vars()
15794 1726882637.52764: Calling all_inventory to load vars for managed_node1
15794 1726882637.52767: Calling groups_inventory to load vars for managed_node1
15794 1726882637.52770: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882637.52786: Calling all_plugins_play to load vars for managed_node1
15794 1726882637.52790: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882637.52794: Calling groups_plugins_play to load vars for managed_node1
15794 1726882637.53358: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000046
15794 1726882637.53362: WORKER PROCESS EXITING
15794 1726882637.56704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882637.59176: done with get_vars()
15794 1726882637.59202: done getting variables
15794 1726882637.59277: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:37:17 -0400 (0:00:00.102)       0:00:35.151 ******
15794 1726882637.59316: entering _queue_task() for managed_node1/service
15794 1726882637.59674: worker is 1 (out of 1 available)
15794 1726882637.59691: exiting _queue_task() for managed_node1/service
15794 1726882637.59707: done queuing things up, now waiting for results queue to drain
15794 1726882637.59708: waiting for pending results...
15794 1726882637.60020: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15794 1726882637.60147: in run() - task 0affe814-3a2d-94e5-e48f-000000000047
15794 1726882637.60168: variable 'ansible_search_path' from source: unknown
15794 1726882637.60173: variable 'ansible_search_path' from source: unknown
15794 1726882637.60212: calling self._execute()
15794 1726882637.60321: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882637.60357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882637.60361: variable 'omit' from source: magic vars
15794 1726882637.60814: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.60841: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882637.60984: variable '__network_wireless_connections_defined' from source: role '' defaults
15794 1726882637.61250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15794 1726882637.64238: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15794 1726882637.64243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15794 1726882637.64411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15794 1726882637.64462: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15794 1726882637.64564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15794 1726882637.64851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.64895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.64979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.65109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.65138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.65307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.65344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.65414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.65472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.65531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.65586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.65626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.65661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.65724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.65743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.66109: variable 'network_connections' from source: play vars
15794 1726882637.66123: variable 'profile' from source: play vars
15794 1726882637.66531: variable 'profile' from source: play vars
15794 1726882637.66537: variable 'interface' from source: set_fact
15794 1726882637.66707: variable 'interface' from source: set_fact
15794 1726882637.67062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15794 1726882637.67542: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15794 1726882637.67546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15794 1726882637.67549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15794 1726882637.67586: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15794 1726882637.67644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15794 1726882637.67712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15794 1726882637.67725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.67753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15794 1726882637.67826: variable '__network_team_connections_defined' from source: role '' defaults
15794 1726882637.68417: variable 'network_connections' from source: play vars
15794 1726882637.68421: variable 'profile' from source: play vars
15794 1726882637.68424: variable 'profile' from source: play vars
15794 1726882637.68525: variable 'interface' from source: set_fact
15794 1726882637.68529: variable 'interface' from source: set_fact
15794 1726882637.68531: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15794 1726882637.68533: when evaluation is False, skipping this task
15794 1726882637.68540: _execute() done
15794 1726882637.68542: dumping result to json
15794 1726882637.68544: done dumping result, returning
15794 1726882637.68546: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000047]
15794 1726882637.68555: sending task result for task 0affe814-3a2d-94e5-e48f-000000000047
15794 1726882637.68623: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000047
15794 1726882637.68626: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15794 1726882637.68709: no more pending results, returning what we have
15794 1726882637.68714: results queue empty
15794 1726882637.68715: checking for any_errors_fatal
15794 1726882637.68721: done checking for any_errors_fatal
15794 1726882637.68722: checking for max_fail_percentage
15794 1726882637.68725: done checking for max_fail_percentage
15794 1726882637.68726: checking to see if all hosts have failed and the running result is not ok
15794 1726882637.68727: done checking to see if all hosts have failed
15794 1726882637.68728: getting the remaining hosts for this loop
15794 1726882637.68730: done getting the remaining hosts for this loop
15794 1726882637.68740: getting the next task for host managed_node1
15794 1726882637.68749: done getting next task for host managed_node1
15794 1726882637.68753: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15794 1726882637.68756: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15794 1726882637.68774: getting variables
15794 1726882637.68776: in VariableManager get_vars()
15794 1726882637.68826: Calling all_inventory to load vars for managed_node1
15794 1726882637.68830: Calling groups_inventory to load vars for managed_node1
15794 1726882637.68833: Calling all_plugins_inventory to load vars for managed_node1
15794 1726882637.68948: Calling all_plugins_play to load vars for managed_node1
15794 1726882637.68953: Calling groups_plugins_inventory to load vars for managed_node1
15794 1726882637.68964: Calling groups_plugins_play to load vars for managed_node1
15794 1726882637.73632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15794 1726882637.77962: done with get_vars()
15794 1726882637.78075: done getting variables
15794 1726882637.78214: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:37:17 -0400 (0:00:00.189)       0:00:35.341 ******
15794 1726882637.78317: entering _queue_task() for managed_node1/service
15794 1726882637.79051: worker is 1 (out of 1 available)
15794 1726882637.79076: exiting _queue_task() for managed_node1/service
15794 1726882637.79090: done queuing things up, now waiting for results queue to drain
15794 1726882637.79092: waiting for pending results...
15794 1726882637.79409: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15794 1726882637.79545: in run() - task 0affe814-3a2d-94e5-e48f-000000000048
15794 1726882637.79562: variable 'ansible_search_path' from source: unknown
15794 1726882637.79566: variable 'ansible_search_path' from source: unknown
15794 1726882637.79699: calling self._execute()
15794 1726882637.79726: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882637.79736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882637.79798: variable 'omit' from source: magic vars
15794 1726882637.80343: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.80347: Evaluated conditional (ansible_distribution_major_version != '6'): True
15794 1726882637.80471: variable 'network_provider' from source: set_fact
15794 1726882637.80476: variable 'network_state' from source: role '' defaults
15794 1726882637.80481: Evaluated conditional (network_provider == "nm" or network_state != {}): True
15794 1726882637.80484: variable 'omit' from source: magic vars
15794 1726882637.80542: variable 'omit' from source: magic vars
15794 1726882637.80575: variable 'network_service_name' from source: role '' defaults
15794 1726882637.80740: variable 'network_service_name' from source: role '' defaults
15794 1726882637.80816: variable '__network_provider_setup' from source: role '' defaults
15794 1726882637.80823: variable '__network_service_name_default_nm' from source: role '' defaults
15794 1726882637.80902: variable '__network_service_name_default_nm' from source: role '' defaults
15794 1726882637.80915: variable '__network_packages_default_nm' from source: role '' defaults
15794 1726882637.80997: variable '__network_packages_default_nm' from source: role '' defaults
15794 1726882637.81352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15794 1726882637.84429: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15794 1726882637.84559: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15794 1726882637.84623: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15794 1726882637.84718: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15794 1726882637.84722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15794 1726882637.84854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.84900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.84940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.85015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.85034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.85108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.85173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.85303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.85307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.85310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.85769: variable '__network_packages_default_gobject_packages' from source: role '' defaults
15794 1726882637.85970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.85998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.86028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.86127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.86130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.86236: variable 'ansible_python' from source: facts
15794 1726882637.86253: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
15794 1726882637.86366: variable '__network_wpa_supplicant_required' from source: role '' defaults
15794 1726882637.86658: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15794 1726882637.86674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.86702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.86731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.86790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.86807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.86894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15794 1726882637.86941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15794 1726882637.86981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.87030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15794 1726882637.87052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15794 1726882637.87355: variable 'network_connections' from source: play vars
15794 1726882637.87359: variable 'profile' from source: play vars
15794 1726882637.87362: variable 'profile' from source: play vars
15794 1726882637.87364: variable 'interface' from source: set_fact
15794 1726882637.87444: variable 'interface' from source: set_fact
15794 1726882637.87586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15794 1726882637.87860: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15794 1726882637.87899: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15794 1726882637.87954: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15794 1726882637.88008: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15794 1726882637.88091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15794 1726882637.88126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15794 1726882637.88167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15794 1726882637.88297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15794 1726882637.88300: variable '__network_wireless_connections_defined' from source: role '' defaults
15794 1726882637.88697: variable 'network_connections' from source: play vars
15794 1726882637.88700: variable 'profile' from source: play vars
15794 1726882637.88797: variable 'profile' from source: play vars
15794 1726882637.88802: variable 'interface' from source: set_fact
15794 1726882637.88920: variable 'interface' from source: set_fact
15794 1726882637.88923: variable '__network_packages_default_wireless' from source: role '' defaults
15794 1726882637.89138: variable '__network_wireless_connections_defined' from source: role '' defaults
15794 1726882637.89448: variable 'network_connections' from source: play vars
15794 1726882637.89455: variable 'profile' from source: play vars
15794 1726882637.89550: variable 'profile' from source: play vars
15794 1726882637.89571: variable 'interface' from source: set_fact
15794 1726882637.89792: variable 'interface' from source: set_fact
15794 1726882637.89796: variable '__network_packages_default_team' from source: role '' defaults
15794 1726882637.89809: variable '__network_team_connections_defined' from source: role '' defaults
15794 1726882637.90593: variable 'network_connections' from source: play vars
15794 1726882637.90645: variable 'profile' from source: play vars
15794 1726882637.90786: variable 'profile' from source: play vars
15794 1726882637.90840: variable 'interface' from source: set_fact
15794 1726882637.90999: variable 'interface' from source: set_fact
15794 1726882637.91245: variable '__network_service_name_default_initscripts' from source: role '' defaults
15794 1726882637.91643: variable '__network_service_name_default_initscripts' from source: role '' defaults
15794 1726882637.91646: variable '__network_packages_default_initscripts' from source: role '' defaults
15794 1726882637.91649: variable '__network_packages_default_initscripts' from source: role '' defaults
15794 1726882637.91945: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15794 1726882637.93402: variable 'network_connections' from source: play vars
15794 1726882637.93408: variable 'profile' from source: play vars
15794 1726882637.93506: variable 'profile' from source: play vars
15794 1726882637.93510: variable 'interface' from source: set_fact
15794 1726882637.93611: variable 'interface' from source: set_fact
15794 1726882637.93622: variable 'ansible_distribution' from source: facts
15794 1726882637.93625: variable '__network_rh_distros' from source: role '' defaults
15794 1726882637.93635: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.93666: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15794 1726882637.94013: variable 'ansible_distribution' from source: facts
15794 1726882637.94017: variable '__network_rh_distros' from source: role '' defaults
15794 1726882637.94020: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.94022: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
15794 1726882637.94244: variable 'ansible_distribution' from source: facts
15794 1726882637.94248: variable '__network_rh_distros' from source: role '' defaults
15794 1726882637.94255: variable 'ansible_distribution_major_version' from source: facts
15794 1726882637.94309: variable 'network_provider' from source: set_fact
15794 1726882637.94341: variable 'omit' from source: magic vars
15794 1726882637.94377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15794 1726882637.94422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15794 1726882637.94444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15794 1726882637.94507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882637.94543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15794 1726882637.94669: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15794 1726882637.94673: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882637.94675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882637.94817: Set connection var ansible_connection to ssh
15794 1726882637.94831: Set connection var ansible_module_compression to ZIP_DEFLATED
15794 1726882637.94841: Set connection var ansible_pipelining to False
15794 1726882637.94915: Set connection var ansible_shell_executable to /bin/sh
15794 1726882637.94920: Set connection var ansible_shell_type to sh
15794 1726882637.94933: Set connection var ansible_timeout to 10
15794 1726882637.95078: variable 'ansible_shell_executable' from source: unknown
15794 1726882637.95081: variable 'ansible_connection' from source: unknown
15794 1726882637.95085: variable 'ansible_module_compression' from source: unknown
15794 1726882637.95087: variable 'ansible_shell_type' from source: unknown
15794 1726882637.95090: variable 'ansible_shell_executable' from source: unknown
15794 1726882637.95092: variable 'ansible_host' from source: host vars for 'managed_node1'
15794 1726882637.95100: variable 'ansible_pipelining' from source: unknown
15794 1726882637.95102: variable 'ansible_timeout' from source: unknown
15794 1726882637.95104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15794 1726882637.95451: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15794 1726882637.95455: variable 'omit' from source: magic vars
15794 1726882637.95458: starting attempt loop
15794 1726882637.95460: running the handler
15794 1726882637.95463: variable 'ansible_facts' from source: unknown
15794 1726882637.96749: _low_level_execute_command(): starting
15794 1726882637.96758: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15794 1726882637.97791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.10.217 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
<<<
15794 1726882637.97802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217
<<<
15794 1726882637.97811: stderr chunk (state=3): >>>debug2: match found
<<<
15794 1726882637.97889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
15794 1726882637.97920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master
<<<
15794 1726882637.97955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
<<<
15794 1726882637.97965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
15794 1726882637.98128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
15794 1726882637.99913: stdout chunk (state=3): >>>/root
<<<
15794 1726882638.00152: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
15794 1726882638.00208: stderr chunk (state=3): >>><<<
15794 1726882638.00307: stdout chunk (state=3): >>><<<
15794 1726882638.00311: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: 
hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882638.00314: _low_level_execute_command(): starting 15794 1726882638.00318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597 `" && echo ansible-tmp-1726882638.0029342-17092-275571880395597="` echo /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597 `" ) && sleep 0' 15794 1726882638.02396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882638.02404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882638.02522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882638.02526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882638.02616: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882638.02778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882638.02781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882638.02881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882638.03172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882638.05001: stdout chunk (state=3): >>>ansible-tmp-1726882638.0029342-17092-275571880395597=/root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597 <<< 15794 1726882638.05159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882638.05291: stderr chunk (state=3): >>><<< 15794 1726882638.05295: stdout chunk (state=3): >>><<< 15794 1726882638.05313: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882638.0029342-17092-275571880395597=/root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882638.05558: variable 'ansible_module_compression' from source: unknown 15794 1726882638.05562: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15794 1726882638.05564: variable 'ansible_facts' from source: unknown 15794 1726882638.05896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/AnsiballZ_systemd.py 15794 1726882638.06273: Sending initial data 15794 1726882638.06377: Sent initial data (156 bytes) 15794 1726882638.08324: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882638.08415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882638.08663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882638.08681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882638.08841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882638.08931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882638.10606: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882638.10717: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882638.10863: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp2h7gvvcc /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/AnsiballZ_systemd.py <<< 15794 1726882638.10942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/AnsiballZ_systemd.py" <<< 15794 1726882638.11123: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp2h7gvvcc" to remote "/root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/AnsiballZ_systemd.py" <<< 15794 1726882638.15615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882638.15630: stderr chunk (state=3): >>><<< 15794 1726882638.15667: stdout chunk (state=3): >>><<< 15794 1726882638.15773: done transferring module to remote 15794 1726882638.15839: _low_level_execute_command(): starting 15794 1726882638.15843: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/ /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/AnsiballZ_systemd.py && sleep 0' 15794 1726882638.16318: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882638.16321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882638.16328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 
1726882638.16331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882638.16336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882638.16400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882638.16406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882638.16408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882638.16461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882638.18655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882638.18662: stdout chunk (state=3): >>><<< 15794 1726882638.18666: stderr chunk (state=3): >>><<< 15794 1726882638.18672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882638.18675: _low_level_execute_command(): starting 15794 1726882638.18677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/AnsiballZ_systemd.py && sleep 0' 15794 1726882638.19321: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882638.19354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882638.19394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882638.19416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882638.19509: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882638.19575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 
1726882638.19605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882638.19701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882638.19865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882638.52345: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "652", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ExecMainStartTimestampMonotonic": "15833159", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "652", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl 
call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_ti<<< 15794 1726882638.52376: stdout chunk (state=3): >>>me=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "1268220000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", 
"IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", 
"PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": 
"network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "network-pre.target system.slice systemd-journald.socket dbus.socket sysinit.target dbus-broker.service cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:32 EDT", "StateChangeTimestampMonotonic": "366878571", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:41 EDT", "InactiveExitTimestampMonotonic": "15833421", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ActiveEnterTimestampMonotonic": "15948855", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ConditionTimestampMonotonic": "15822215", "AssertTimestamp": "Fri 2024-09-20 21:27:41 EDT", "AssertTimestampMonotonic": "15822218", "Transient": "no", "Perpetual": "no", 
"StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "9d67906d6bf74ff48c21207bf47afee4", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15794 1726882638.54320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882638.54490: stderr chunk (state=3): >>><<< 15794 1726882638.54494: stdout chunk (state=3): >>><<< 15794 1726882638.54727: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "652", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ExecMainStartTimestampMonotonic": "15833159", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "652", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "1268220000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", 
"MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "network-pre.target system.slice systemd-journald.socket dbus.socket sysinit.target dbus-broker.service cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:32 EDT", "StateChangeTimestampMonotonic": "366878571", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:41 EDT", "InactiveExitTimestampMonotonic": "15833421", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ActiveEnterTimestampMonotonic": "15948855", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ConditionTimestampMonotonic": "15822215", "AssertTimestamp": "Fri 2024-09-20 21:27:41 EDT", "AssertTimestampMonotonic": "15822218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "9d67906d6bf74ff48c21207bf47afee4", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882638.55351: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882638.55408: _low_level_execute_command(): starting 15794 1726882638.55447: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882638.0029342-17092-275571880395597/ > /dev/null 2>&1 && sleep 0' 15794 1726882638.56669: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882638.56953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882638.56962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882638.57014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882638.57061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882638.58965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882638.59038: stderr chunk (state=3): >>><<< 15794 1726882638.59049: stdout chunk (state=3): >>><<< 15794 1726882638.59063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882638.59073: handler run complete 15794 1726882638.59157: attempt loop complete, returning result 15794 1726882638.59161: _execute() done 15794 1726882638.59164: dumping result to json 15794 1726882638.59187: done dumping result, returning 15794 1726882638.59339: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-94e5-e48f-000000000048] 15794 1726882638.59343: sending task result for task 0affe814-3a2d-94e5-e48f-000000000048 15794 1726882638.59744: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000048 15794 1726882638.59747: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882638.59799: no more pending results, returning what we have 15794 1726882638.59803: results queue empty 15794 1726882638.59804: checking for any_errors_fatal 15794 1726882638.59808: done checking for any_errors_fatal 15794 1726882638.59809: checking for max_fail_percentage 15794 1726882638.59811: done checking for max_fail_percentage 15794 1726882638.59812: checking to see if all hosts have failed and the running result is not ok 15794 1726882638.59813: done checking to see if all hosts have failed 15794 1726882638.59814: getting the remaining hosts for this loop 15794 1726882638.59816: done getting the remaining hosts for this loop 15794 1726882638.59819: getting the next task for host managed_node1 15794 1726882638.59825: done getting next task for host managed_node1 15794 1726882638.59829: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15794 1726882638.59831: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882638.59851: getting variables 15794 1726882638.59853: in VariableManager get_vars() 15794 1726882638.59893: Calling all_inventory to load vars for managed_node1 15794 1726882638.59897: Calling groups_inventory to load vars for managed_node1 15794 1726882638.59900: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882638.59912: Calling all_plugins_play to load vars for managed_node1 15794 1726882638.59916: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882638.59920: Calling groups_plugins_play to load vars for managed_node1 15794 1726882638.64805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882638.68311: done with get_vars() 15794 1726882638.68366: done getting variables 15794 1726882638.68622: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:18 -0400 (0:00:00.903) 0:00:36.244 ****** 15794 1726882638.68700: entering _queue_task() for managed_node1/service 15794 1726882638.69253: worker is 1 (out of 1 available) 15794 1726882638.69266: exiting _queue_task() for managed_node1/service 15794 1726882638.69277: done queuing things up, now waiting for results queue to drain 15794 1726882638.69321: waiting for pending results... 
15794 1726882638.69628: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15794 1726882638.69726: in run() - task 0affe814-3a2d-94e5-e48f-000000000049 15794 1726882638.69749: variable 'ansible_search_path' from source: unknown 15794 1726882638.69764: variable 'ansible_search_path' from source: unknown 15794 1726882638.69812: calling self._execute() 15794 1726882638.70001: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882638.70016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882638.70033: variable 'omit' from source: magic vars 15794 1726882638.70638: variable 'ansible_distribution_major_version' from source: facts 15794 1726882638.70662: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882638.70841: variable 'network_provider' from source: set_fact 15794 1726882638.70858: Evaluated conditional (network_provider == "nm"): True 15794 1726882638.71036: variable '__network_wpa_supplicant_required' from source: role '' defaults 15794 1726882638.71130: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15794 1726882638.71393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882638.74528: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882638.74636: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882638.74829: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882638.74950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882638.74981: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882638.75204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882638.75248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882638.75284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882638.75339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882638.75359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882638.75416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882638.75497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882638.75501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882638.75532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882638.75560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882638.75627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882638.75682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882638.75770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882638.75916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882638.75967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882638.76402: variable 'network_connections' from source: play vars 15794 1726882638.76440: variable 'profile' from source: play vars 15794 1726882638.76589: variable 'profile' from source: play vars 15794 1726882638.76593: variable 'interface' from source: set_fact 15794 1726882638.76829: variable 'interface' from source: set_fact 15794 1726882638.76838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882638.77335: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882638.77424: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882638.77464: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882638.77505: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882638.77575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882638.77601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882638.77960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882638.77964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882638.77968: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882638.78621: variable 'network_connections' from source: play vars 15794 1726882638.78636: variable 'profile' from source: play vars 15794 1726882638.78752: variable 'profile' from source: play vars 15794 1726882638.78786: variable 'interface' from source: set_fact 15794 1726882638.78871: variable 'interface' from source: set_fact 15794 1726882638.78967: Evaluated conditional (__network_wpa_supplicant_required): False 15794 1726882638.78973: when evaluation is False, skipping this task 15794 1726882638.78976: _execute() done 15794 1726882638.79146: dumping result 
to json 15794 1726882638.79149: done dumping result, returning 15794 1726882638.79152: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-94e5-e48f-000000000049] 15794 1726882638.79154: sending task result for task 0affe814-3a2d-94e5-e48f-000000000049 15794 1726882638.79225: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000049 15794 1726882638.79228: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15794 1726882638.79288: no more pending results, returning what we have 15794 1726882638.79292: results queue empty 15794 1726882638.79294: checking for any_errors_fatal 15794 1726882638.79320: done checking for any_errors_fatal 15794 1726882638.79321: checking for max_fail_percentage 15794 1726882638.79323: done checking for max_fail_percentage 15794 1726882638.79324: checking to see if all hosts have failed and the running result is not ok 15794 1726882638.79325: done checking to see if all hosts have failed 15794 1726882638.79326: getting the remaining hosts for this loop 15794 1726882638.79327: done getting the remaining hosts for this loop 15794 1726882638.79331: getting the next task for host managed_node1 15794 1726882638.79339: done getting next task for host managed_node1 15794 1726882638.79344: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15794 1726882638.79346: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882638.79364: getting variables 15794 1726882638.79367: in VariableManager get_vars() 15794 1726882638.79411: Calling all_inventory to load vars for managed_node1 15794 1726882638.79414: Calling groups_inventory to load vars for managed_node1 15794 1726882638.79417: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882638.79427: Calling all_plugins_play to load vars for managed_node1 15794 1726882638.79430: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882638.79433: Calling groups_plugins_play to load vars for managed_node1 15794 1726882638.82606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882638.85600: done with get_vars() 15794 1726882638.85636: done getting variables 15794 1726882638.85710: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:18 -0400 (0:00:00.170) 0:00:36.415 ****** 15794 1726882638.85747: entering _queue_task() for managed_node1/service 15794 1726882638.86102: worker is 1 (out of 1 available) 15794 1726882638.86118: exiting _queue_task() for managed_node1/service 15794 1726882638.86130: done queuing things up, now waiting for results queue to drain 15794 1726882638.86132: waiting for pending results... 
15794 1726882638.86640: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15794 1726882638.86650: in run() - task 0affe814-3a2d-94e5-e48f-00000000004a 15794 1726882638.86654: variable 'ansible_search_path' from source: unknown 15794 1726882638.86658: variable 'ansible_search_path' from source: unknown 15794 1726882638.86661: calling self._execute() 15794 1726882638.86747: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882638.86751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882638.86794: variable 'omit' from source: magic vars 15794 1726882638.87650: variable 'ansible_distribution_major_version' from source: facts 15794 1726882638.87654: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882638.87825: variable 'network_provider' from source: set_fact 15794 1726882638.87857: Evaluated conditional (network_provider == "initscripts"): False 15794 1726882638.87862: when evaluation is False, skipping this task 15794 1726882638.87865: _execute() done 15794 1726882638.87868: dumping result to json 15794 1726882638.87871: done dumping result, returning 15794 1726882638.87942: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-94e5-e48f-00000000004a] 15794 1726882638.87949: sending task result for task 0affe814-3a2d-94e5-e48f-00000000004a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882638.88224: no more pending results, returning what we have 15794 1726882638.88228: results queue empty 15794 1726882638.88230: checking for any_errors_fatal 15794 1726882638.88244: done checking for any_errors_fatal 15794 1726882638.88246: checking for max_fail_percentage 15794 1726882638.88248: done checking for max_fail_percentage 15794 
1726882638.88249: checking to see if all hosts have failed and the running result is not ok 15794 1726882638.88250: done checking to see if all hosts have failed 15794 1726882638.88251: getting the remaining hosts for this loop 15794 1726882638.88253: done getting the remaining hosts for this loop 15794 1726882638.88258: getting the next task for host managed_node1 15794 1726882638.88267: done getting next task for host managed_node1 15794 1726882638.88271: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15794 1726882638.88275: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882638.88294: getting variables 15794 1726882638.88297: in VariableManager get_vars() 15794 1726882638.88589: Calling all_inventory to load vars for managed_node1 15794 1726882638.88592: Calling groups_inventory to load vars for managed_node1 15794 1726882638.88595: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882638.88606: Calling all_plugins_play to load vars for managed_node1 15794 1726882638.88609: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882638.88613: Calling groups_plugins_play to load vars for managed_node1 15794 1726882638.89237: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000004a 15794 1726882638.89243: WORKER PROCESS EXITING 15794 1726882638.91723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882638.95109: done with get_vars() 15794 1726882638.95156: done getting variables 15794 1726882638.95253: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:18 -0400 (0:00:00.095) 0:00:36.511 ****** 15794 1726882638.95338: entering _queue_task() for managed_node1/copy 15794 1726882638.95826: worker is 1 (out of 1 available) 15794 1726882638.95844: exiting _queue_task() for managed_node1/copy 15794 1726882638.95864: done queuing things up, now waiting for results queue to drain 15794 1726882638.95866: waiting for pending results... 15794 1726882638.96260: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15794 1726882638.96815: in run() - task 0affe814-3a2d-94e5-e48f-00000000004b 15794 1726882638.96819: variable 'ansible_search_path' from source: unknown 15794 1726882638.96822: variable 'ansible_search_path' from source: unknown 15794 1726882638.96824: calling self._execute() 15794 1726882638.96905: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882638.96922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882638.96926: variable 'omit' from source: magic vars 15794 1726882638.97344: variable 'ansible_distribution_major_version' from source: facts 15794 1726882638.97363: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882638.97502: variable 'network_provider' from source: set_fact 15794 1726882638.97540: Evaluated conditional (network_provider == "initscripts"): False 15794 1726882638.97544: when evaluation is False, skipping this task 15794 1726882638.97547: _execute() done 15794 1726882638.97550: dumping result to json 
15794 1726882638.97552: done dumping result, returning 15794 1726882638.97556: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-94e5-e48f-00000000004b] 15794 1726882638.97558: sending task result for task 0affe814-3a2d-94e5-e48f-00000000004b 15794 1726882638.97755: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000004b 15794 1726882638.97758: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15794 1726882638.97822: no more pending results, returning what we have 15794 1726882638.97825: results queue empty 15794 1726882638.97826: checking for any_errors_fatal 15794 1726882638.97831: done checking for any_errors_fatal 15794 1726882638.97833: checking for max_fail_percentage 15794 1726882638.97836: done checking for max_fail_percentage 15794 1726882638.97837: checking to see if all hosts have failed and the running result is not ok 15794 1726882638.97838: done checking to see if all hosts have failed 15794 1726882638.97839: getting the remaining hosts for this loop 15794 1726882638.97841: done getting the remaining hosts for this loop 15794 1726882638.97844: getting the next task for host managed_node1 15794 1726882638.97850: done getting next task for host managed_node1 15794 1726882638.97855: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15794 1726882638.97857: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882638.97881: getting variables 15794 1726882638.97883: in VariableManager get_vars() 15794 1726882638.97919: Calling all_inventory to load vars for managed_node1 15794 1726882638.97922: Calling groups_inventory to load vars for managed_node1 15794 1726882638.97926: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882638.97939: Calling all_plugins_play to load vars for managed_node1 15794 1726882638.97942: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882638.97947: Calling groups_plugins_play to load vars for managed_node1 15794 1726882639.00733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882639.04466: done with get_vars() 15794 1726882639.04505: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:19 -0400 (0:00:00.092) 0:00:36.604 ****** 15794 1726882639.04619: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15794 1726882639.05155: worker is 1 (out of 1 available) 15794 1726882639.05166: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15794 1726882639.05178: done queuing things up, now waiting for results queue to drain 15794 1726882639.05181: waiting for pending results... 
15794 1726882639.05551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15794 1726882639.05560: in run() - task 0affe814-3a2d-94e5-e48f-00000000004c 15794 1726882639.05587: variable 'ansible_search_path' from source: unknown 15794 1726882639.05598: variable 'ansible_search_path' from source: unknown 15794 1726882639.05682: calling self._execute() 15794 1726882639.05866: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882639.05881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882639.05896: variable 'omit' from source: magic vars 15794 1726882639.06523: variable 'ansible_distribution_major_version' from source: facts 15794 1726882639.06606: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882639.06614: variable 'omit' from source: magic vars 15794 1726882639.06636: variable 'omit' from source: magic vars 15794 1726882639.06876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882639.10548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882639.10670: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882639.10776: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882639.10790: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882639.10897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882639.10988: variable 'network_provider' from source: set_fact 15794 1726882639.11205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882639.11278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882639.11319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882639.11392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882639.11416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882639.11525: variable 'omit' from source: magic vars 15794 1726882639.11714: variable 'omit' from source: magic vars 15794 1726882639.11872: variable 'network_connections' from source: play vars 15794 1726882639.11900: variable 'profile' from source: play vars 15794 1726882639.12000: variable 'profile' from source: play vars 15794 1726882639.12031: variable 'interface' from source: set_fact 15794 1726882639.12108: variable 'interface' from source: set_fact 15794 1726882639.12360: variable 'omit' from source: magic vars 15794 1726882639.12363: variable '__lsr_ansible_managed' from source: task vars 15794 1726882639.12475: variable '__lsr_ansible_managed' from source: task vars 15794 1726882639.12941: Loaded config def from plugin (lookup/template) 15794 1726882639.12968: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15794 1726882639.12996: File lookup term: get_ansible_managed.j2 15794 
1726882639.13041: variable 'ansible_search_path' from source: unknown 15794 1726882639.13047: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15794 1726882639.13054: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15794 1726882639.13063: variable 'ansible_search_path' from source: unknown 15794 1726882639.29031: variable 'ansible_managed' from source: unknown 15794 1726882639.29410: variable 'omit' from source: magic vars 15794 1726882639.29570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882639.29576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882639.29579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882639.29582: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882639.29585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882639.29788: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882639.29792: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882639.29794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882639.30006: Set connection var ansible_connection to ssh 15794 1726882639.30010: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882639.30013: Set connection var ansible_pipelining to False 15794 1726882639.30015: Set connection var ansible_shell_executable to /bin/sh 15794 1726882639.30017: Set connection var ansible_shell_type to sh 15794 1726882639.30019: Set connection var ansible_timeout to 10 15794 1726882639.30118: variable 'ansible_shell_executable' from source: unknown 15794 1726882639.30121: variable 'ansible_connection' from source: unknown 15794 1726882639.30124: variable 'ansible_module_compression' from source: unknown 15794 1726882639.30129: variable 'ansible_shell_type' from source: unknown 15794 1726882639.30137: variable 'ansible_shell_executable' from source: unknown 15794 1726882639.30139: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882639.30144: variable 'ansible_pipelining' from source: unknown 15794 1726882639.30705: variable 'ansible_timeout' from source: unknown 15794 1726882639.30708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882639.30711: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882639.30721: variable 'omit' from source: magic vars 15794 1726882639.30723: starting attempt loop 15794 1726882639.30726: running the handler 15794 1726882639.30728: _low_level_execute_command(): starting 15794 1726882639.30730: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882639.32854: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882639.32925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882639.32941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882639.33156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882639.33351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882639.35017: stdout chunk (state=3): >>>/root <<< 15794 1726882639.35196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 
1726882639.35203: stdout chunk (state=3): >>><<< 15794 1726882639.35212: stderr chunk (state=3): >>><<< 15794 1726882639.35243: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882639.35257: _low_level_execute_command(): starting 15794 1726882639.35261: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705 `" && echo ansible-tmp-1726882639.352397-17133-63775666916705="` echo /root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705 `" ) && sleep 0' 15794 1726882639.36104: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882639.36115: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882639.36119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882639.36132: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882639.36249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882639.36253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882639.36356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882639.38848: stdout chunk (state=3): >>>ansible-tmp-1726882639.352397-17133-63775666916705=/root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705 <<< 15794 1726882639.38852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882639.38855: stdout chunk (state=3): >>><<< 15794 1726882639.38857: stderr chunk (state=3): >>><<< 15794 1726882639.38860: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882639.352397-17133-63775666916705=/root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882639.38863: variable 'ansible_module_compression' from source: unknown 15794 1726882639.38986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15794 1726882639.39067: variable 'ansible_facts' from source: unknown 15794 1726882639.39208: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/AnsiballZ_network_connections.py 15794 1726882639.39364: Sending initial data 15794 1726882639.39368: Sent initial data (166 bytes) 15794 1726882639.39960: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882639.39963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882639.39980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 
1726882639.39995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882639.40051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882639.40113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882639.40154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882639.40225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882639.42027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882639.42086: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882639.42115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp0ahf31g0 /root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/AnsiballZ_network_connections.py <<< 15794 1726882639.42142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/AnsiballZ_network_connections.py" <<< 15794 1726882639.42369: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp0ahf31g0" to remote "/root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/AnsiballZ_network_connections.py" <<< 15794 1726882639.44202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882639.44295: stderr chunk (state=3): >>><<< 15794 1726882639.44298: stdout chunk (state=3): >>><<< 15794 1726882639.44422: done transferring module to remote 15794 1726882639.44426: _low_level_execute_command(): starting 15794 1726882639.44430: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/ /root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/AnsiballZ_network_connections.py && sleep 0' 15794 1726882639.44983: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882639.45050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882639.45120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882639.45142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882639.45166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882639.45230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882639.47448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882639.47452: stdout chunk (state=3): >>><<< 15794 1726882639.47456: stderr chunk (state=3): >>><<< 15794 1726882639.47459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882639.47462: _low_level_execute_command(): starting 15794 1726882639.47465: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/AnsiballZ_network_connections.py && sleep 0' 15794 1726882639.48030: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882639.48049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882639.48064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882639.48088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882639.48154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882639.48211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882639.48231: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882639.48272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882639.48383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882639.82225: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15794 1726882639.84594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882639.84600: stdout chunk (state=3): >>><<< 15794 1726882639.84603: stderr chunk (state=3): >>><<< 15794 1726882639.84605: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 
originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882639.84742: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882639.84746: _low_level_execute_command(): starting 15794 1726882639.84749: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882639.352397-17133-63775666916705/ > /dev/null 2>&1 && sleep 0' 15794 1726882639.86018: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882639.86239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882639.86349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882639.86415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882639.88328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882639.88521: stderr chunk (state=3): >>><<< 15794 1726882639.88524: stdout chunk (state=3): >>><<< 15794 1726882639.88548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882639.88562: handler run complete 15794 1726882639.88816: attempt loop complete, returning result 15794 1726882639.88819: _execute() done 15794 1726882639.88822: dumping result to json 15794 1726882639.88824: done dumping result, returning 15794 1726882639.88826: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-94e5-e48f-00000000004c] 15794 1726882639.88829: sending task result for task 0affe814-3a2d-94e5-e48f-00000000004c changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15794 1726882639.89123: no more pending results, returning what we have 15794 1726882639.89127: results queue empty 15794 1726882639.89128: checking for any_errors_fatal 15794 1726882639.89138: done checking for any_errors_fatal 15794 1726882639.89140: checking for max_fail_percentage 15794 1726882639.89142: 
done checking for max_fail_percentage 15794 1726882639.89143: checking to see if all hosts have failed and the running result is not ok 15794 1726882639.89144: done checking to see if all hosts have failed 15794 1726882639.89145: getting the remaining hosts for this loop 15794 1726882639.89148: done getting the remaining hosts for this loop 15794 1726882639.89153: getting the next task for host managed_node1 15794 1726882639.89162: done getting next task for host managed_node1 15794 1726882639.89166: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15794 1726882639.89169: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882639.89186: getting variables 15794 1726882639.89189: in VariableManager get_vars() 15794 1726882639.89232: Calling all_inventory to load vars for managed_node1 15794 1726882639.89642: Calling groups_inventory to load vars for managed_node1 15794 1726882639.89646: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882639.89653: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000004c 15794 1726882639.89656: WORKER PROCESS EXITING 15794 1726882639.89667: Calling all_plugins_play to load vars for managed_node1 15794 1726882639.89671: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882639.89676: Calling groups_plugins_play to load vars for managed_node1 15794 1726882639.93371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882639.96816: done with get_vars() 15794 1726882639.97048: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:19 -0400 (0:00:00.927) 0:00:37.531 ****** 15794 1726882639.97349: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15794 1726882639.98420: worker is 1 (out of 1 available) 15794 1726882639.98436: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15794 1726882639.98565: done queuing things up, now waiting for results queue to drain 15794 1726882639.98567: waiting for pending results... 15794 1726882639.99020: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15794 1726882639.99356: in run() - task 0affe814-3a2d-94e5-e48f-00000000004d 15794 1726882639.99372: variable 'ansible_search_path' from source: unknown 15794 1726882639.99376: variable 'ansible_search_path' from source: unknown 15794 1726882639.99439: calling self._execute() 15794 1726882640.00152: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.00200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.00204: variable 'omit' from source: magic vars 15794 1726882640.01219: variable 'ansible_distribution_major_version' from source: facts 15794 1726882640.01231: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882640.01724: variable 'network_state' from source: role '' defaults 15794 1726882640.01729: Evaluated conditional (network_state != {}): False 15794 1726882640.01732: when evaluation is False, skipping this task 15794 1726882640.01736: _execute() done 15794 1726882640.01739: dumping result to json 15794 1726882640.01741: done dumping result, returning 15794 1726882640.01744: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-94e5-e48f-00000000004d] 15794 1726882640.01746: 
sending task result for task 0affe814-3a2d-94e5-e48f-00000000004d 15794 1726882640.01897: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000004d 15794 1726882640.01900: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882640.01983: no more pending results, returning what we have 15794 1726882640.01987: results queue empty 15794 1726882640.01989: checking for any_errors_fatal 15794 1726882640.01996: done checking for any_errors_fatal 15794 1726882640.01997: checking for max_fail_percentage 15794 1726882640.01999: done checking for max_fail_percentage 15794 1726882640.02000: checking to see if all hosts have failed and the running result is not ok 15794 1726882640.02001: done checking to see if all hosts have failed 15794 1726882640.02002: getting the remaining hosts for this loop 15794 1726882640.02003: done getting the remaining hosts for this loop 15794 1726882640.02008: getting the next task for host managed_node1 15794 1726882640.02014: done getting next task for host managed_node1 15794 1726882640.02018: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15794 1726882640.02021: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882640.02039: getting variables 15794 1726882640.02041: in VariableManager get_vars() 15794 1726882640.02079: Calling all_inventory to load vars for managed_node1 15794 1726882640.02082: Calling groups_inventory to load vars for managed_node1 15794 1726882640.02086: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882640.02098: Calling all_plugins_play to load vars for managed_node1 15794 1726882640.02101: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882640.02106: Calling groups_plugins_play to load vars for managed_node1 15794 1726882640.08017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882640.12821: done with get_vars() 15794 1726882640.13013: done getting variables 15794 1726882640.13340: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:20 -0400 (0:00:00.160) 0:00:37.691 ****** 15794 1726882640.13382: entering _queue_task() for managed_node1/debug 15794 1726882640.13955: worker is 1 (out of 1 available) 15794 1726882640.13968: exiting _queue_task() for managed_node1/debug 15794 1726882640.13979: done queuing things up, now waiting for results queue to drain 15794 1726882640.14095: waiting for pending results... 
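
The skip recorded above ("Evaluated conditional (network_state != {}): False") happens because the role defaults `network_state` to an empty dict, so the state-based provider path is bypassed unless the caller sets it. A hypothetical sketch of what such a guarded task looks like (this is an illustration of the pattern, not the role's exact source at `roles/network/tasks/main.yml:171`):

```yaml
# Hypothetical sketch: network_state defaults to {} in the role's defaults,
# so the `when:` condition evaluates False and the task is skipped with
# "Conditional result was False", exactly as the log shows.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when: network_state != {}
```
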
15794 1726882640.14549: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15794 1726882640.14753: in run() - task 0affe814-3a2d-94e5-e48f-00000000004e 15794 1726882640.14802: variable 'ansible_search_path' from source: unknown 15794 1726882640.14806: variable 'ansible_search_path' from source: unknown 15794 1726882640.14848: calling self._execute() 15794 1726882640.15077: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.15082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.15211: variable 'omit' from source: magic vars 15794 1726882640.16189: variable 'ansible_distribution_major_version' from source: facts 15794 1726882640.16201: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882640.16209: variable 'omit' from source: magic vars 15794 1726882640.16441: variable 'omit' from source: magic vars 15794 1726882640.16446: variable 'omit' from source: magic vars 15794 1726882640.16618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882640.16682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882640.16706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882640.16726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882640.16741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882640.16904: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882640.16909: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.16913: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15794 1726882640.17189: Set connection var ansible_connection to ssh 15794 1726882640.17315: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882640.17324: Set connection var ansible_pipelining to False 15794 1726882640.17332: Set connection var ansible_shell_executable to /bin/sh 15794 1726882640.17338: Set connection var ansible_shell_type to sh 15794 1726882640.17348: Set connection var ansible_timeout to 10 15794 1726882640.17385: variable 'ansible_shell_executable' from source: unknown 15794 1726882640.17389: variable 'ansible_connection' from source: unknown 15794 1726882640.17392: variable 'ansible_module_compression' from source: unknown 15794 1726882640.17471: variable 'ansible_shell_type' from source: unknown 15794 1726882640.17475: variable 'ansible_shell_executable' from source: unknown 15794 1726882640.17478: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.17480: variable 'ansible_pipelining' from source: unknown 15794 1726882640.17581: variable 'ansible_timeout' from source: unknown 15794 1726882640.17585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.17909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882640.17914: variable 'omit' from source: magic vars 15794 1726882640.17963: starting attempt loop 15794 1726882640.17967: running the handler 15794 1726882640.18292: variable '__network_connections_result' from source: set_fact 15794 1726882640.18422: handler run complete 15794 1726882640.18442: attempt loop complete, returning result 15794 1726882640.18451: _execute() done 15794 1726882640.18457: dumping result to json 15794 1726882640.18460: 
done dumping result, returning 15794 1726882640.18463: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-94e5-e48f-00000000004e] 15794 1726882640.18472: sending task result for task 0affe814-3a2d-94e5-e48f-00000000004e ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15794 1726882640.18800: no more pending results, returning what we have 15794 1726882640.18804: results queue empty 15794 1726882640.18805: checking for any_errors_fatal 15794 1726882640.18813: done checking for any_errors_fatal 15794 1726882640.18814: checking for max_fail_percentage 15794 1726882640.18816: done checking for max_fail_percentage 15794 1726882640.18817: checking to see if all hosts have failed and the running result is not ok 15794 1726882640.18818: done checking to see if all hosts have failed 15794 1726882640.18818: getting the remaining hosts for this loop 15794 1726882640.18821: done getting the remaining hosts for this loop 15794 1726882640.18825: getting the next task for host managed_node1 15794 1726882640.18833: done getting next task for host managed_node1 15794 1726882640.18839: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15794 1726882640.18841: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882640.18853: getting variables 15794 1726882640.18855: in VariableManager get_vars() 15794 1726882640.18896: Calling all_inventory to load vars for managed_node1 15794 1726882640.18900: Calling groups_inventory to load vars for managed_node1 15794 1726882640.18902: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882640.18914: Calling all_plugins_play to load vars for managed_node1 15794 1726882640.18917: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882640.18920: Calling groups_plugins_play to load vars for managed_node1 15794 1726882640.19522: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000004e 15794 1726882640.19525: WORKER PROCESS EXITING 15794 1726882640.23964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882640.28016: done with get_vars() 15794 1726882640.28165: done getting variables 15794 1726882640.28301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:20 -0400 (0:00:00.149) 0:00:37.841 ****** 15794 1726882640.28343: entering _queue_task() for managed_node1/debug 15794 1726882640.28819: worker is 1 (out of 1 available) 15794 1726882640.28836: exiting _queue_task() for managed_node1/debug 15794 1726882640.28849: done queuing things up, now waiting for results queue to drain 15794 1726882640.28851: waiting for pending results... 
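
The task that just completed is a plain `debug` action printing a registered fact; a minimal sketch of the shape of such a task (an assumption about the task at `roles/network/tasks/main.yml:177`, not its verbatim source):

```yaml
# Hypothetical sketch: a debug task that prints the stderr lines captured
# from the earlier network module run via the registered/set fact
# __network_connections_result.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```

Because `debug` runs locally against already-gathered variables, the log shows it loading the action plugin from cache rather than opening a new SSH session for module transfer.
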
15794 1726882640.29133: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15794 1726882640.29272: in run() - task 0affe814-3a2d-94e5-e48f-00000000004f 15794 1726882640.29297: variable 'ansible_search_path' from source: unknown 15794 1726882640.29344: variable 'ansible_search_path' from source: unknown 15794 1726882640.29362: calling self._execute() 15794 1726882640.29473: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.29489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.29506: variable 'omit' from source: magic vars 15794 1726882640.30017: variable 'ansible_distribution_major_version' from source: facts 15794 1726882640.30107: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882640.30111: variable 'omit' from source: magic vars 15794 1726882640.30125: variable 'omit' from source: magic vars 15794 1726882640.30184: variable 'omit' from source: magic vars 15794 1726882640.30244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882640.30290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882640.30328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882640.30359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882640.30378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882640.30418: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882640.30433: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.30447: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15794 1726882640.30639: Set connection var ansible_connection to ssh 15794 1726882640.30643: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882640.30650: Set connection var ansible_pipelining to False 15794 1726882640.30653: Set connection var ansible_shell_executable to /bin/sh 15794 1726882640.30657: Set connection var ansible_shell_type to sh 15794 1726882640.30660: Set connection var ansible_timeout to 10 15794 1726882640.30682: variable 'ansible_shell_executable' from source: unknown 15794 1726882640.30693: variable 'ansible_connection' from source: unknown 15794 1726882640.30702: variable 'ansible_module_compression' from source: unknown 15794 1726882640.30710: variable 'ansible_shell_type' from source: unknown 15794 1726882640.30718: variable 'ansible_shell_executable' from source: unknown 15794 1726882640.30739: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.30742: variable 'ansible_pipelining' from source: unknown 15794 1726882640.30744: variable 'ansible_timeout' from source: unknown 15794 1726882640.30751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.30980: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882640.30986: variable 'omit' from source: magic vars 15794 1726882640.30989: starting attempt loop 15794 1726882640.30991: running the handler 15794 1726882640.31046: variable '__network_connections_result' from source: set_fact 15794 1726882640.31198: variable '__network_connections_result' from source: set_fact 15794 1726882640.31309: handler run complete 15794 1726882640.31355: attempt loop complete, returning result 15794 1726882640.31364: 
_execute() done 15794 1726882640.31373: dumping result to json 15794 1726882640.31383: done dumping result, returning 15794 1726882640.31397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-94e5-e48f-00000000004f] 15794 1726882640.31426: sending task result for task 0affe814-3a2d-94e5-e48f-00000000004f ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15794 1726882640.31644: no more pending results, returning what we have 15794 1726882640.31649: results queue empty 15794 1726882640.31650: checking for any_errors_fatal 15794 1726882640.31656: done checking for any_errors_fatal 15794 1726882640.31657: checking for max_fail_percentage 15794 1726882640.31659: done checking for max_fail_percentage 15794 1726882640.31660: checking to see if all hosts have failed and the running result is not ok 15794 1726882640.31661: done checking to see if all hosts have failed 15794 1726882640.31662: getting the remaining hosts for this loop 15794 1726882640.31665: done getting the remaining hosts for this loop 15794 1726882640.31670: getting the next task for host managed_node1 15794 1726882640.31678: done getting next task for host managed_node1 15794 1726882640.31683: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15794 1726882640.31686: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15794 1726882640.31700: getting variables 15794 1726882640.31702: in VariableManager get_vars() 15794 1726882640.31950: Calling all_inventory to load vars for managed_node1 15794 1726882640.31954: Calling groups_inventory to load vars for managed_node1 15794 1726882640.31957: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882640.32041: Calling all_plugins_play to load vars for managed_node1 15794 1726882640.32046: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882640.32051: Calling groups_plugins_play to load vars for managed_node1 15794 1726882640.32587: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000004f 15794 1726882640.32591: WORKER PROCESS EXITING 15794 1726882640.34799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882640.39681: done with get_vars() 15794 1726882640.39717: done getting variables 15794 1726882640.39903: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:20 -0400 (0:00:00.115) 0:00:37.957 ****** 15794 1726882640.40039: entering _queue_task() for managed_node1/debug 15794 1726882640.40770: worker is 1 (out of 1 available) 15794 1726882640.40782: exiting _queue_task() for managed_node1/debug 15794 1726882640.40794: done queuing things up, now waiting for results queue to drain 15794 1726882640.40796: waiting for pending results... 
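
The result above reports `"stderr": "\n"` alongside `"stderr_lines": [""]`. That pairing follows from splitting the raw string on newlines (a minimal illustration of the derivation; the exact helper Ansible uses internally is an assumption here):

```python
# A stderr consisting of a single newline is one empty line, so splitting
# it on line boundaries yields a list with one empty string - matching the
# "stderr_lines": [""] seen in the task result.
stderr = "\n"
stderr_lines = stderr.splitlines()
print(stderr_lines)  # ['']
```
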
15794 1726882640.41073: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15794 1726882640.41442: in run() - task 0affe814-3a2d-94e5-e48f-000000000050 15794 1726882640.41446: variable 'ansible_search_path' from source: unknown 15794 1726882640.41449: variable 'ansible_search_path' from source: unknown 15794 1726882640.41462: calling self._execute() 15794 1726882640.41678: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.41790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.41798: variable 'omit' from source: magic vars 15794 1726882640.42790: variable 'ansible_distribution_major_version' from source: facts 15794 1726882640.42794: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882640.43226: variable 'network_state' from source: role '' defaults 15794 1726882640.43229: Evaluated conditional (network_state != {}): False 15794 1726882640.43232: when evaluation is False, skipping this task 15794 1726882640.43237: _execute() done 15794 1726882640.43239: dumping result to json 15794 1726882640.43242: done dumping result, returning 15794 1726882640.43245: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-94e5-e48f-000000000050] 15794 1726882640.43248: sending task result for task 0affe814-3a2d-94e5-e48f-000000000050 skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15794 1726882640.43414: no more pending results, returning what we have 15794 1726882640.43419: results queue empty 15794 1726882640.43421: checking for any_errors_fatal 15794 1726882640.43433: done checking for any_errors_fatal 15794 1726882640.43440: checking for max_fail_percentage 15794 1726882640.43442: done checking for max_fail_percentage 15794 1726882640.43443: checking to see if all hosts have 
failed and the running result is not ok 15794 1726882640.43444: done checking to see if all hosts have failed 15794 1726882640.43445: getting the remaining hosts for this loop 15794 1726882640.43448: done getting the remaining hosts for this loop 15794 1726882640.43453: getting the next task for host managed_node1 15794 1726882640.43462: done getting next task for host managed_node1 15794 1726882640.43466: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15794 1726882640.43470: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882640.43489: getting variables 15794 1726882640.43492: in VariableManager get_vars() 15794 1726882640.43839: Calling all_inventory to load vars for managed_node1 15794 1726882640.43843: Calling groups_inventory to load vars for managed_node1 15794 1726882640.43847: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882640.43862: Calling all_plugins_play to load vars for managed_node1 15794 1726882640.43867: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882640.43939: Calling groups_plugins_play to load vars for managed_node1 15794 1726882640.44536: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000050 15794 1726882640.44540: WORKER PROCESS EXITING 15794 1726882640.53480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882640.57442: done with get_vars() 15794 1726882640.57495: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:20 -0400 
(0:00:00.176) 0:00:38.134 ****** 15794 1726882640.57606: entering _queue_task() for managed_node1/ping 15794 1726882640.58011: worker is 1 (out of 1 available) 15794 1726882640.58144: exiting _queue_task() for managed_node1/ping 15794 1726882640.58156: done queuing things up, now waiting for results queue to drain 15794 1726882640.58159: waiting for pending results... 15794 1726882640.58395: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15794 1726882640.58563: in run() - task 0affe814-3a2d-94e5-e48f-000000000051 15794 1726882640.58590: variable 'ansible_search_path' from source: unknown 15794 1726882640.58594: variable 'ansible_search_path' from source: unknown 15794 1726882640.58640: calling self._execute() 15794 1726882640.58775: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.58819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.58839: variable 'omit' from source: magic vars 15794 1726882640.59446: variable 'ansible_distribution_major_version' from source: facts 15794 1726882640.59463: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882640.59496: variable 'omit' from source: magic vars 15794 1726882640.59542: variable 'omit' from source: magic vars 15794 1726882640.59607: variable 'omit' from source: magic vars 15794 1726882640.59716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882640.59778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882640.59814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882640.59838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882640.59850: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882640.59888: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882640.59892: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.59897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.60018: Set connection var ansible_connection to ssh 15794 1726882640.60038: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882640.60041: Set connection var ansible_pipelining to False 15794 1726882640.60212: Set connection var ansible_shell_executable to /bin/sh 15794 1726882640.60216: Set connection var ansible_shell_type to sh 15794 1726882640.60218: Set connection var ansible_timeout to 10 15794 1726882640.60221: variable 'ansible_shell_executable' from source: unknown 15794 1726882640.60223: variable 'ansible_connection' from source: unknown 15794 1726882640.60227: variable 'ansible_module_compression' from source: unknown 15794 1726882640.60229: variable 'ansible_shell_type' from source: unknown 15794 1726882640.60231: variable 'ansible_shell_executable' from source: unknown 15794 1726882640.60233: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882640.60238: variable 'ansible_pipelining' from source: unknown 15794 1726882640.60240: variable 'ansible_timeout' from source: unknown 15794 1726882640.60242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882640.60642: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882640.60647: variable 'omit' from source: magic vars 15794 1726882640.60650: starting attempt loop 15794 1726882640.60652: running 
the handler 15794 1726882640.60655: _low_level_execute_command(): starting 15794 1726882640.60658: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882640.61347: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882640.61421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882640.61557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882640.63303: stdout chunk (state=3): >>>/root <<< 15794 1726882640.63417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882640.63486: stderr chunk (state=3): >>><<< 15794 1726882640.63489: stdout chunk (state=3): >>><<< 15794 1726882640.63516: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882640.63532: _low_level_execute_command(): starting 15794 1726882640.63541: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445 `" && echo ansible-tmp-1726882640.6351595-17213-58072450592445="` echo /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445 `" ) && sleep 0' 15794 1726882640.64358: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882640.64362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882640.64365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882640.64368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882640.64371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882640.64374: 
stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882640.64376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882640.64392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882640.64402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882640.64409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15794 1726882640.64418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882640.64430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882640.64445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882640.64454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882640.64466: stderr chunk (state=3): >>>debug2: match found <<< 15794 1726882640.64471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882640.64683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882640.64688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882640.64761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882640.64826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882640.66826: stdout chunk (state=3): >>>ansible-tmp-1726882640.6351595-17213-58072450592445=/root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445 <<< 15794 1726882640.67003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882640.67007: stdout chunk (state=3): >>><<< 15794 1726882640.67014: stderr chunk (state=3): >>><<< 15794 
1726882640.67158: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882640.6351595-17213-58072450592445=/root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882640.67213: variable 'ansible_module_compression' from source: unknown 15794 1726882640.67261: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15794 1726882640.67306: variable 'ansible_facts' from source: unknown 15794 1726882640.67582: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/AnsiballZ_ping.py 15794 1726882640.67968: Sending initial data 15794 1726882640.67972: Sent initial data (152 bytes) 15794 1726882640.69223: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882640.69299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882640.69305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882640.69401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882640.69578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882640.71210: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882640.71215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882640.71425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpb5eg2nx6 /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/AnsiballZ_ping.py <<< 15794 1726882640.71428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/AnsiballZ_ping.py" <<< 15794 1726882640.71475: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpb5eg2nx6" to remote "/root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/AnsiballZ_ping.py" <<< 15794 1726882640.73509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882640.73595: stderr chunk (state=3): >>><<< 15794 1726882640.73599: stdout chunk (state=3): >>><<< 15794 1726882640.73640: done transferring module to remote 15794 1726882640.73644: _low_level_execute_command(): starting 15794 1726882640.73647: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/ /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/AnsiballZ_ping.py && sleep 0' 15794 1726882640.74815: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882640.75155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882640.75164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882640.75240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882640.75265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882640.77165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882640.77244: stderr chunk (state=3): >>><<< 15794 1726882640.77552: stdout chunk (state=3): >>><<< 15794 1726882640.77570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882640.77574: _low_level_execute_command(): starting 15794 1726882640.77581: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/AnsiballZ_ping.py && sleep 0' 15794 1726882640.78232: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882640.78244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882640.78257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882640.78277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882640.78291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882640.78300: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882640.78313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882640.78327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882640.78338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882640.78346: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15794 1726882640.78423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882640.78426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882640.78428: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882640.78431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882640.78433: stderr chunk (state=3): >>>debug2: match found <<< 15794 1726882640.78447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882640.78645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882640.78649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882640.78651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882640.78770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882640.95484: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15794 1726882640.97058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882640.97063: stdout chunk (state=3): >>><<< 15794 1726882640.97065: stderr chunk (state=3): >>><<< 15794 1726882640.97068: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882640.97070: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882640.97073: _low_level_execute_command(): starting 15794 1726882640.97075: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882640.6351595-17213-58072450592445/ > /dev/null 2>&1 && sleep 0' 15794 1726882640.97643: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882640.97653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882640.97666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882640.97688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882640.97702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882640.97713: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882640.97722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882640.97737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882640.97754: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 
1726882640.97762: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15794 1726882640.97771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882640.97785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882640.97821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882640.97927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882640.97932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882640.97989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882640.99967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882640.99971: stdout chunk (state=3): >>><<< 15794 1726882640.99974: stderr chunk (state=3): >>><<< 15794 1726882641.00140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882641.00149: handler run complete 15794 1726882641.00152: attempt loop complete, returning result 15794 1726882641.00155: _execute() done 15794 1726882641.00157: dumping result to json 15794 1726882641.00160: done dumping result, returning 15794 1726882641.00163: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-94e5-e48f-000000000051] 15794 1726882641.00165: sending task result for task 0affe814-3a2d-94e5-e48f-000000000051 ok: [managed_node1] => { "changed": false, "ping": "pong" } 15794 1726882641.00492: no more pending results, returning what we have 15794 1726882641.00496: results queue empty 15794 1726882641.00497: checking for any_errors_fatal 15794 1726882641.00503: done checking for any_errors_fatal 15794 1726882641.00504: checking for max_fail_percentage 15794 1726882641.00506: done checking for max_fail_percentage 15794 1726882641.00507: checking to see if all hosts have failed and the running result is not ok 15794 1726882641.00508: done checking to see if all hosts have failed 15794 1726882641.00509: getting the remaining hosts for this loop 15794 1726882641.00511: done getting the remaining hosts for this loop 15794 1726882641.00515: getting the next task for host managed_node1 15794 1726882641.00522: done getting next task for host managed_node1 15794 1726882641.00525: ^ task is: TASK: meta (role_complete) 15794 1726882641.00527: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882641.00538: getting variables 15794 1726882641.00540: in VariableManager get_vars() 15794 1726882641.00582: Calling all_inventory to load vars for managed_node1 15794 1726882641.00586: Calling groups_inventory to load vars for managed_node1 15794 1726882641.00594: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882641.00601: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000051 15794 1726882641.00604: WORKER PROCESS EXITING 15794 1726882641.00613: Calling all_plugins_play to load vars for managed_node1 15794 1726882641.00618: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882641.00622: Calling groups_plugins_play to load vars for managed_node1 15794 1726882641.03075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882641.06776: done with get_vars() 15794 1726882641.06814: done getting variables 15794 1726882641.07032: done queuing things up, now waiting for results queue to drain 15794 1726882641.07038: results queue empty 15794 1726882641.07039: checking for any_errors_fatal 15794 1726882641.07043: done checking for any_errors_fatal 15794 1726882641.07044: checking for max_fail_percentage 15794 1726882641.07045: done checking for max_fail_percentage 15794 1726882641.07046: checking to see if all hosts have failed and the running result is not ok 15794 1726882641.07047: done checking to see if all hosts have failed 15794 1726882641.07048: getting the remaining hosts for this loop 15794 1726882641.07049: done getting the remaining hosts for this loop 15794 1726882641.07052: getting the next task for host managed_node1 15794 1726882641.07057: done getting next task for host managed_node1 15794 1726882641.07059: ^ task is: TASK: meta (flush_handlers) 15794 1726882641.07061: ^ 
state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882641.07137: getting variables 15794 1726882641.07139: in VariableManager get_vars() 15794 1726882641.07155: Calling all_inventory to load vars for managed_node1 15794 1726882641.07158: Calling groups_inventory to load vars for managed_node1 15794 1726882641.07161: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882641.07167: Calling all_plugins_play to load vars for managed_node1 15794 1726882641.07170: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882641.07182: Calling groups_plugins_play to load vars for managed_node1 15794 1726882641.09798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882641.13547: done with get_vars() 15794 1726882641.13577: done getting variables 15794 1726882641.13646: in VariableManager get_vars() 15794 1726882641.13663: Calling all_inventory to load vars for managed_node1 15794 1726882641.13666: Calling groups_inventory to load vars for managed_node1 15794 1726882641.13669: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882641.13675: Calling all_plugins_play to load vars for managed_node1 15794 1726882641.13682: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882641.13686: Calling groups_plugins_play to load vars for managed_node1 15794 1726882641.15883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882641.20337: done with get_vars() 15794 1726882641.20377: done queuing things up, now waiting for results queue to drain 15794 1726882641.20382: results queue empty 15794 1726882641.20383: checking for 
any_errors_fatal 15794 1726882641.20385: done checking for any_errors_fatal 15794 1726882641.20386: checking for max_fail_percentage 15794 1726882641.20387: done checking for max_fail_percentage 15794 1726882641.20388: checking to see if all hosts have failed and the running result is not ok 15794 1726882641.20393: done checking to see if all hosts have failed 15794 1726882641.20394: getting the remaining hosts for this loop 15794 1726882641.20396: done getting the remaining hosts for this loop 15794 1726882641.20399: getting the next task for host managed_node1 15794 1726882641.20404: done getting next task for host managed_node1 15794 1726882641.20406: ^ task is: TASK: meta (flush_handlers) 15794 1726882641.20408: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882641.20411: getting variables 15794 1726882641.20413: in VariableManager get_vars() 15794 1726882641.20545: Calling all_inventory to load vars for managed_node1 15794 1726882641.20549: Calling groups_inventory to load vars for managed_node1 15794 1726882641.20552: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882641.20558: Calling all_plugins_play to load vars for managed_node1 15794 1726882641.20562: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882641.20566: Calling groups_plugins_play to load vars for managed_node1 15794 1726882641.23261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882641.26239: done with get_vars() 15794 1726882641.26272: done getting variables 15794 1726882641.26345: in VariableManager get_vars() 15794 1726882641.26360: Calling all_inventory to load vars for managed_node1 15794 1726882641.26363: Calling groups_inventory to load vars for managed_node1 15794 1726882641.26366: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882641.26373: Calling all_plugins_play to load vars for managed_node1 15794 1726882641.26376: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882641.26383: Calling groups_plugins_play to load vars for managed_node1 15794 1726882641.28445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882641.31461: done with get_vars() 15794 1726882641.31502: done queuing things up, now waiting for results queue to drain 15794 1726882641.31504: results queue empty 15794 1726882641.31506: checking for any_errors_fatal 15794 1726882641.31507: done checking for any_errors_fatal 15794 1726882641.31508: checking for max_fail_percentage 15794 1726882641.31510: done checking for max_fail_percentage 15794 1726882641.31511: checking to see if all hosts have failed and the running result is not 
ok 15794 1726882641.31512: done checking to see if all hosts have failed 15794 1726882641.31513: getting the remaining hosts for this loop 15794 1726882641.31514: done getting the remaining hosts for this loop 15794 1726882641.31517: getting the next task for host managed_node1 15794 1726882641.31521: done getting next task for host managed_node1 15794 1726882641.31522: ^ task is: None 15794 1726882641.31524: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882641.31526: done queuing things up, now waiting for results queue to drain 15794 1726882641.31527: results queue empty 15794 1726882641.31528: checking for any_errors_fatal 15794 1726882641.31529: done checking for any_errors_fatal 15794 1726882641.31530: checking for max_fail_percentage 15794 1726882641.31531: done checking for max_fail_percentage 15794 1726882641.31532: checking to see if all hosts have failed and the running result is not ok 15794 1726882641.31533: done checking to see if all hosts have failed 15794 1726882641.31536: getting the next task for host managed_node1 15794 1726882641.31539: done getting next task for host managed_node1 15794 1726882641.31540: ^ task is: None 15794 1726882641.31542: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882641.31596: in VariableManager get_vars() 15794 1726882641.31615: done with get_vars() 15794 1726882641.31623: in VariableManager get_vars() 15794 1726882641.31642: done with get_vars() 15794 1726882641.31649: variable 'omit' from source: magic vars 15794 1726882641.31697: in VariableManager get_vars() 15794 1726882641.31711: done with get_vars() 15794 1726882641.31739: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15794 1726882641.31989: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882641.32016: getting the remaining hosts for this loop 15794 1726882641.32018: done getting the remaining hosts for this loop 15794 1726882641.32021: getting the next task for host managed_node1 15794 1726882641.32024: done getting next task for host managed_node1 15794 1726882641.32026: ^ task is: TASK: Gathering Facts 15794 1726882641.32028: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882641.32031: getting variables 15794 1726882641.32032: in VariableManager get_vars() 15794 1726882641.32046: Calling all_inventory to load vars for managed_node1 15794 1726882641.32049: Calling groups_inventory to load vars for managed_node1 15794 1726882641.32052: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882641.32058: Calling all_plugins_play to load vars for managed_node1 15794 1726882641.32062: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882641.32066: Calling groups_plugins_play to load vars for managed_node1 15794 1726882641.34282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882641.37117: done with get_vars() 15794 1726882641.37149: done getting variables 15794 1726882641.37197: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:37:21 -0400 (0:00:00.796) 0:00:38.930 ****** 15794 1726882641.37223: entering _queue_task() for managed_node1/gather_facts 15794 1726882641.37575: worker is 1 (out of 1 available) 15794 1726882641.37588: exiting _queue_task() for managed_node1/gather_facts 15794 1726882641.37599: done queuing things up, now waiting for results queue to drain 15794 1726882641.37601: waiting for pending results... 
15794 1726882641.37951: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882641.38082: in run() - task 0affe814-3a2d-94e5-e48f-0000000003f8 15794 1726882641.38123: variable 'ansible_search_path' from source: unknown 15794 1726882641.38155: calling self._execute() 15794 1726882641.38266: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882641.38297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882641.38341: variable 'omit' from source: magic vars 15794 1726882641.38818: variable 'ansible_distribution_major_version' from source: facts 15794 1726882641.38843: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882641.38858: variable 'omit' from source: magic vars 15794 1726882641.38905: variable 'omit' from source: magic vars 15794 1726882641.39007: variable 'omit' from source: magic vars 15794 1726882641.39032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882641.39087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882641.39122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882641.39153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882641.39225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882641.39229: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882641.39236: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882641.39248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882641.39403: Set connection var ansible_connection to ssh 15794 1726882641.39422: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882641.39441: Set connection var ansible_pipelining to False 15794 1726882641.39457: Set connection var ansible_shell_executable to /bin/sh 15794 1726882641.39467: Set connection var ansible_shell_type to sh 15794 1726882641.39540: Set connection var ansible_timeout to 10 15794 1726882641.39549: variable 'ansible_shell_executable' from source: unknown 15794 1726882641.39553: variable 'ansible_connection' from source: unknown 15794 1726882641.39555: variable 'ansible_module_compression' from source: unknown 15794 1726882641.39558: variable 'ansible_shell_type' from source: unknown 15794 1726882641.39565: variable 'ansible_shell_executable' from source: unknown 15794 1726882641.39575: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882641.39644: variable 'ansible_pipelining' from source: unknown 15794 1726882641.39647: variable 'ansible_timeout' from source: unknown 15794 1726882641.39650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882641.39876: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882641.39900: variable 'omit' from source: magic vars 15794 1726882641.39910: starting attempt loop 15794 1726882641.39917: running the handler 15794 1726882641.39939: variable 'ansible_facts' from source: unknown 15794 1726882641.39987: _low_level_execute_command(): starting 15794 1726882641.39990: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882641.40862: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882641.40911: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882641.40981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882641.41050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882641.41105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882641.43017: stdout chunk (state=3): >>>/root <<< 15794 1726882641.43085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882641.43115: stderr chunk (state=3): >>><<< 15794 1726882641.43140: stdout chunk (state=3): >>><<< 15794 1726882641.43168: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882641.43235: _low_level_execute_command(): starting 15794 1726882641.43241: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779 `" && echo ansible-tmp-1726882641.4317544-17243-114091493356779="` echo /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779 `" ) && sleep 0' 15794 1726882641.43895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882641.43913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882641.43928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882641.43953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882641.43972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882641.43985: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882641.44014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882641.44122: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882641.44141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882641.44241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882641.46340: stdout chunk (state=3): >>>ansible-tmp-1726882641.4317544-17243-114091493356779=/root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779 <<< 15794 1726882641.46413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882641.46424: stdout chunk (state=3): >>><<< 15794 1726882641.46839: stderr chunk (state=3): >>><<< 15794 1726882641.46844: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882641.4317544-17243-114091493356779=/root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882641.46847: variable 'ansible_module_compression' from source: unknown 15794 1726882641.46850: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882641.46852: variable 'ansible_facts' from source: unknown 15794 1726882641.47207: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/AnsiballZ_setup.py 15794 1726882641.47581: Sending initial data 15794 1726882641.47594: Sent initial data (154 bytes) 15794 1726882641.49402: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882641.49406: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882641.49503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882641.49673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882641.49693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882641.49715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882641.49825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882641.51430: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882641.51504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882641.51587: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp1o3jr843 /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/AnsiballZ_setup.py <<< 15794 1726882641.51611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/AnsiballZ_setup.py" <<< 15794 1726882641.51659: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp1o3jr843" to remote "/root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/AnsiballZ_setup.py" <<< 15794 1726882641.54145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882641.54237: stderr chunk (state=3): >>><<< 15794 1726882641.54254: stdout chunk (state=3): >>><<< 15794 1726882641.54290: done transferring module to remote 15794 1726882641.54308: _low_level_execute_command(): starting 15794 1726882641.54319: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/ /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/AnsiballZ_setup.py && sleep 0' 15794 1726882641.54946: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882641.54962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882641.54982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882641.55001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882641.55018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 
1726882641.55031: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882641.55056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882641.55077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882641.55095: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882641.55149: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882641.55209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882641.55226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882641.55250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882641.55331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882641.57237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882641.57248: stdout chunk (state=3): >>><<< 15794 1726882641.57268: stderr chunk (state=3): >>><<< 15794 1726882641.57291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882641.57302: _low_level_execute_command(): starting 15794 1726882641.57315: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/AnsiballZ_setup.py && sleep 0' 15794 1726882641.57963: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882641.58053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15794 1726882641.58110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882641.58126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882641.58149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882641.58237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882642.27787: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.50927734375, "5m": 0.44189453125, "15m": 0.2216796875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "21", "epoch": "1726882641", "epoch_int": "1726882641", "date": "2024-09-20", "time": "21:37:21", "iso8601_micro": "2024-09-21T01:37:21.893135Z", "iso8601": 
"2024-09-21T01:37:21Z", "iso8601_basic": "20240920T213721893135", "iso8601_basic_short": "20240920T213721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2870, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 847, "free": 2870}, "nocache": {"free": 3475, "used": 242}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 596, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205234688, "block_size": 4096, "block_total": 64483404, "block_available": 61329403, "block_used": 3154001, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_interfaces": ["lsr27", "peerlsr27", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": 
"12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b471:fa1a:61d2:e391", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391", "fe80::bb10:9a17:6b35:7604"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} <<< 15794 1726882642.29716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882642.29796: stderr chunk (state=3): >>><<< 15794 1726882642.29808: stdout chunk (state=3): >>><<< 15794 1726882642.29853: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": 
"ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.50927734375, "5m": 0.44189453125, "15m": 0.2216796875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "21", "epoch": "1726882641", "epoch_int": "1726882641", "date": "2024-09-20", "time": "21:37:21", "iso8601_micro": "2024-09-21T01:37:21.893135Z", "iso8601": "2024-09-21T01:37:21Z", "iso8601_basic": "20240920T213721893135", "iso8601_basic_short": "20240920T213721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, 
"crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2870, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 847, "free": 2870}, "nocache": {"free": 3475, "used": 242}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, 
"size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 596, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205234688, "block_size": 4096, "block_total": 64483404, "block_available": 61329403, "block_used": 3154001, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_interfaces": ["lsr27", "peerlsr27", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:e0:28:bd:b9:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::34e0:28ff:febd:b99f", "prefix": 
"64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "46:97:5a:58:86:a9", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b471:fa1a:61d2:e391", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::34e0:28ff:febd:b99f", "fe80::b471:fa1a:61d2:e391", "fe80::bb10:9a17:6b35:7604"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882642.30295: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882642.30325: _low_level_execute_command(): starting 15794 1726882642.30417: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882641.4317544-17243-114091493356779/ > /dev/null 2>&1 && sleep 0' 15794 1726882642.30956: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882642.30971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882642.30993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882642.31050: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882642.31119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882642.31140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882642.31157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882642.31244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882642.33209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882642.33221: stdout chunk (state=3): >>><<< 15794 1726882642.33237: stderr chunk (state=3): >>><<< 15794 1726882642.33259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882642.33273: handler run complete 15794 1726882642.33645: variable 'ansible_facts' from source: unknown 15794 1726882642.33648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882642.34122: variable 'ansible_facts' from source: unknown 15794 1726882642.34257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882642.34470: attempt loop complete, returning result 15794 1726882642.34481: _execute() done 15794 1726882642.34491: dumping result to json 15794 1726882642.34539: done dumping result, returning 15794 1726882642.34554: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-0000000003f8] 15794 1726882642.34566: sending task result for task 0affe814-3a2d-94e5-e48f-0000000003f8 ok: [managed_node1] 15794 1726882642.35507: no more pending results, returning what we have 15794 1726882642.35511: results queue empty 15794 1726882642.35512: checking for any_errors_fatal 15794 1726882642.35514: done checking for any_errors_fatal 15794 1726882642.35515: checking for max_fail_percentage 15794 1726882642.35517: done checking for max_fail_percentage 15794 1726882642.35518: checking to see if all hosts have failed and the running result is not ok 15794 1726882642.35519: done checking 
to see if all hosts have failed 15794 1726882642.35520: getting the remaining hosts for this loop 15794 1726882642.35522: done getting the remaining hosts for this loop 15794 1726882642.35526: getting the next task for host managed_node1 15794 1726882642.35772: done getting next task for host managed_node1 15794 1726882642.35777: ^ task is: TASK: meta (flush_handlers) 15794 1726882642.35779: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882642.35784: getting variables 15794 1726882642.35786: in VariableManager get_vars() 15794 1726882642.35811: Calling all_inventory to load vars for managed_node1 15794 1726882642.35814: Calling groups_inventory to load vars for managed_node1 15794 1726882642.35818: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882642.35826: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000003f8 15794 1726882642.35829: WORKER PROCESS EXITING 15794 1726882642.35841: Calling all_plugins_play to load vars for managed_node1 15794 1726882642.35845: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882642.35850: Calling groups_plugins_play to load vars for managed_node1 15794 1726882642.38155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882642.41085: done with get_vars() 15794 1726882642.41129: done getting variables 15794 1726882642.41213: in VariableManager get_vars() 15794 1726882642.41232: Calling all_inventory to load vars for managed_node1 15794 1726882642.41237: Calling groups_inventory to load vars for managed_node1 15794 1726882642.41241: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882642.41247: Calling all_plugins_play to 
load vars for managed_node1 15794 1726882642.41251: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882642.41255: Calling groups_plugins_play to load vars for managed_node1 15794 1726882642.43391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882642.46293: done with get_vars() 15794 1726882642.46341: done queuing things up, now waiting for results queue to drain 15794 1726882642.46344: results queue empty 15794 1726882642.46345: checking for any_errors_fatal 15794 1726882642.46351: done checking for any_errors_fatal 15794 1726882642.46352: checking for max_fail_percentage 15794 1726882642.46353: done checking for max_fail_percentage 15794 1726882642.46358: checking to see if all hosts have failed and the running result is not ok 15794 1726882642.46359: done checking to see if all hosts have failed 15794 1726882642.46360: getting the remaining hosts for this loop 15794 1726882642.46361: done getting the remaining hosts for this loop 15794 1726882642.46365: getting the next task for host managed_node1 15794 1726882642.46370: done getting next task for host managed_node1 15794 1726882642.46373: ^ task is: TASK: Include the task 'delete_interface.yml' 15794 1726882642.46375: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882642.46378: getting variables 15794 1726882642.46379: in VariableManager get_vars() 15794 1726882642.46390: Calling all_inventory to load vars for managed_node1 15794 1726882642.46393: Calling groups_inventory to load vars for managed_node1 15794 1726882642.46397: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882642.46403: Calling all_plugins_play to load vars for managed_node1 15794 1726882642.46407: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882642.46411: Calling groups_plugins_play to load vars for managed_node1 15794 1726882642.48479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882642.52895: done with get_vars() 15794 1726882642.52931: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:37:22 -0400 (0:00:01.158) 0:00:40.088 ****** 15794 1726882642.53028: entering _queue_task() for managed_node1/include_tasks 15794 1726882642.53516: worker is 1 (out of 1 available) 15794 1726882642.53542: exiting _queue_task() for managed_node1/include_tasks 15794 1726882642.53555: done queuing things up, now waiting for results queue to drain 15794 1726882642.53556: waiting for pending results... 
15794 1726882642.53780: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 15794 1726882642.53981: in run() - task 0affe814-3a2d-94e5-e48f-000000000054 15794 1726882642.54055: variable 'ansible_search_path' from source: unknown 15794 1726882642.54243: calling self._execute() 15794 1726882642.54414: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882642.54641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882642.54646: variable 'omit' from source: magic vars 15794 1726882642.55453: variable 'ansible_distribution_major_version' from source: facts 15794 1726882642.55558: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882642.55580: _execute() done 15794 1726882642.55590: dumping result to json 15794 1726882642.55679: done dumping result, returning 15794 1726882642.55683: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0affe814-3a2d-94e5-e48f-000000000054] 15794 1726882642.55686: sending task result for task 0affe814-3a2d-94e5-e48f-000000000054 15794 1726882642.55982: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000054 15794 1726882642.55986: WORKER PROCESS EXITING 15794 1726882642.56039: no more pending results, returning what we have 15794 1726882642.56047: in VariableManager get_vars() 15794 1726882642.56087: Calling all_inventory to load vars for managed_node1 15794 1726882642.56092: Calling groups_inventory to load vars for managed_node1 15794 1726882642.56097: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882642.56119: Calling all_plugins_play to load vars for managed_node1 15794 1726882642.56124: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882642.56129: Calling groups_plugins_play to load vars for managed_node1 15794 1726882642.59388: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882642.63381: done with get_vars() 15794 1726882642.63419: variable 'ansible_search_path' from source: unknown 15794 1726882642.63439: we have included files to process 15794 1726882642.63440: generating all_blocks data 15794 1726882642.63442: done generating all_blocks data 15794 1726882642.63443: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15794 1726882642.63444: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15794 1726882642.63448: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15794 1726882642.63734: done processing included file 15794 1726882642.63738: iterating over new_blocks loaded from include file 15794 1726882642.63739: in VariableManager get_vars() 15794 1726882642.63755: done with get_vars() 15794 1726882642.63757: filtering new block on tags 15794 1726882642.63781: done filtering new block on tags 15794 1726882642.63784: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 15794 1726882642.63790: extending task lists for all hosts with included blocks 15794 1726882642.63830: done extending task lists 15794 1726882642.63832: done processing included files 15794 1726882642.63833: results queue empty 15794 1726882642.63839: checking for any_errors_fatal 15794 1726882642.63841: done checking for any_errors_fatal 15794 1726882642.63842: checking for max_fail_percentage 15794 1726882642.63843: done checking for max_fail_percentage 15794 1726882642.63844: checking to see if all hosts have failed and the running result 
is not ok 15794 1726882642.63845: done checking to see if all hosts have failed 15794 1726882642.63846: getting the remaining hosts for this loop 15794 1726882642.63848: done getting the remaining hosts for this loop 15794 1726882642.63851: getting the next task for host managed_node1 15794 1726882642.63855: done getting next task for host managed_node1 15794 1726882642.63858: ^ task is: TASK: Remove test interface if necessary 15794 1726882642.63861: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882642.63863: getting variables 15794 1726882642.63865: in VariableManager get_vars() 15794 1726882642.63875: Calling all_inventory to load vars for managed_node1 15794 1726882642.63878: Calling groups_inventory to load vars for managed_node1 15794 1726882642.63884: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882642.63891: Calling all_plugins_play to load vars for managed_node1 15794 1726882642.63894: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882642.63898: Calling groups_plugins_play to load vars for managed_node1 15794 1726882642.66937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882642.70237: done with get_vars() 15794 1726882642.70275: done getting variables 15794 1726882642.70336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:37:22 -0400 (0:00:00.173) 0:00:40.261 ****** 15794 1726882642.70373: entering _queue_task() for managed_node1/command 15794 1726882642.70750: worker is 1 (out of 1 available) 15794 1726882642.70880: exiting _queue_task() for managed_node1/command 15794 1726882642.70892: done queuing things up, now waiting for results queue to drain 15794 1726882642.70894: waiting for pending results... 
15794 1726882642.71129: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 15794 1726882642.71353: in run() - task 0affe814-3a2d-94e5-e48f-000000000409 15794 1726882642.71357: variable 'ansible_search_path' from source: unknown 15794 1726882642.71362: variable 'ansible_search_path' from source: unknown 15794 1726882642.71365: calling self._execute() 15794 1726882642.71476: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882642.71487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882642.71500: variable 'omit' from source: magic vars 15794 1726882642.72003: variable 'ansible_distribution_major_version' from source: facts 15794 1726882642.72007: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882642.72010: variable 'omit' from source: magic vars 15794 1726882642.72074: variable 'omit' from source: magic vars 15794 1726882642.72218: variable 'interface' from source: set_fact 15794 1726882642.72237: variable 'omit' from source: magic vars 15794 1726882642.72286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882642.72335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882642.72358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882642.72392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882642.72395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882642.72539: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882642.72542: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882642.72547: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882642.72597: Set connection var ansible_connection to ssh 15794 1726882642.72601: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882642.72610: Set connection var ansible_pipelining to False 15794 1726882642.72618: Set connection var ansible_shell_executable to /bin/sh 15794 1726882642.72630: Set connection var ansible_shell_type to sh 15794 1726882642.72644: Set connection var ansible_timeout to 10 15794 1726882642.72677: variable 'ansible_shell_executable' from source: unknown 15794 1726882642.72681: variable 'ansible_connection' from source: unknown 15794 1726882642.72687: variable 'ansible_module_compression' from source: unknown 15794 1726882642.72690: variable 'ansible_shell_type' from source: unknown 15794 1726882642.72695: variable 'ansible_shell_executable' from source: unknown 15794 1726882642.72701: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882642.72706: variable 'ansible_pipelining' from source: unknown 15794 1726882642.72708: variable 'ansible_timeout' from source: unknown 15794 1726882642.72715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882642.72908: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882642.72917: variable 'omit' from source: magic vars 15794 1726882642.72924: starting attempt loop 15794 1726882642.72928: running the handler 15794 1726882642.72946: _low_level_execute_command(): starting 15794 1726882642.72964: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882642.74044: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882642.74049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882642.74053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882642.74144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882642.74237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882642.75992: stdout chunk (state=3): >>>/root <<< 15794 1726882642.76302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882642.76306: stderr chunk (state=3): >>><<< 15794 1726882642.76309: stdout chunk (state=3): >>><<< 15794 1726882642.76312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882642.76315: _low_level_execute_command(): starting 15794 1726882642.76318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177 `" && echo ansible-tmp-1726882642.7619681-17311-207951216075177="` echo /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177 `" ) && sleep 0' 15794 1726882642.76878: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882642.76895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882642.76967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882642.77049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882642.77091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882642.77130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882642.77208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882642.79182: stdout chunk (state=3): >>>ansible-tmp-1726882642.7619681-17311-207951216075177=/root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177 <<< 15794 1726882642.79360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882642.79390: stderr chunk (state=3): >>><<< 15794 1726882642.79401: stdout chunk (state=3): >>><<< 15794 1726882642.79429: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882642.7619681-17311-207951216075177=/root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882642.79479: variable 'ansible_module_compression' from source: unknown 15794 1726882642.79578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882642.79596: variable 'ansible_facts' from source: unknown 15794 1726882642.79675: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/AnsiballZ_command.py 15794 1726882642.79865: Sending initial data 15794 1726882642.79868: Sent initial data (156 bytes) 15794 1726882642.80553: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882642.80593: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882642.80617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882642.80644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882642.80722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882642.82313: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882642.82567: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882642.82640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp55pe8zy_ /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/AnsiballZ_command.py <<< 15794 1726882642.82644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/AnsiballZ_command.py" <<< 15794 1726882642.82723: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp55pe8zy_" to remote "/root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/AnsiballZ_command.py" <<< 15794 1726882642.83894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882642.84009: stderr chunk (state=3): >>><<< 15794 1726882642.84026: stdout chunk (state=3): >>><<< 15794 1726882642.84060: done transferring module to remote 15794 1726882642.84079: _low_level_execute_command(): starting 15794 1726882642.84098: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/ /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/AnsiballZ_command.py && sleep 0' 15794 1726882642.84807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882642.84853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882642.84945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882642.84983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882642.85066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882642.86973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882642.86976: stdout chunk (state=3): >>><<< 15794 1726882642.86979: stderr chunk (state=3): >>><<< 15794 1726882642.87041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882642.87048: _low_level_execute_command(): starting 15794 1726882642.87052: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/AnsiballZ_command.py && sleep 0' 15794 1726882642.87735: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882642.87760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882642.87775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882642.87796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882642.87875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882642.87924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882642.87943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882642.87969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15794 1726882642.88068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882643.06847: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:37:23.049720", "end": "2024-09-20 21:37:23.059930", "delta": "0:00:00.010210", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882643.08821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882643.08839: stdout chunk (state=3): >>><<< 15794 1726882643.08854: stderr chunk (state=3): >>><<< 15794 1726882643.08879: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:37:23.049720", "end": "2024-09-20 21:37:23.059930", "delta": "0:00:00.010210", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882643.09031: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882643.09037: _low_level_execute_command(): starting 15794 1726882643.09040: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882642.7619681-17311-207951216075177/ > /dev/null 2>&1 && sleep 0' 15794 1726882643.09588: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882643.09605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882643.09621: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15794 1726882643.09646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882643.09666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882643.09773: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882643.09807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882643.09920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882643.11825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882643.11904: stderr chunk (state=3): >>><<< 15794 1726882643.11907: stdout chunk (state=3): >>><<< 15794 1726882643.11928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882643.11937: handler run complete 15794 1726882643.11966: Evaluated conditional (False): False 15794 1726882643.11981: attempt loop complete, returning result 15794 1726882643.11987: _execute() done 15794 1726882643.11991: dumping result to json 15794 1726882643.11999: done dumping result, returning 15794 1726882643.12009: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0affe814-3a2d-94e5-e48f-000000000409] 15794 1726882643.12015: sending task result for task 0affe814-3a2d-94e5-e48f-000000000409 15794 1726882643.12324: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000409 15794 1726882643.12327: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.010210", "end": "2024-09-20 21:37:23.059930", "rc": 0, "start": "2024-09-20 21:37:23.049720" } 15794 1726882643.12439: no more pending results, returning what we have 15794 1726882643.12443: results queue empty 15794 1726882643.12444: checking for any_errors_fatal 15794 1726882643.12446: done checking for any_errors_fatal 15794 1726882643.12447: checking for max_fail_percentage 15794 1726882643.12449: done checking for max_fail_percentage 15794 1726882643.12450: checking to see if all hosts have failed and the running 
result is not ok 15794 1726882643.12451: done checking to see if all hosts have failed 15794 1726882643.12452: getting the remaining hosts for this loop 15794 1726882643.12454: done getting the remaining hosts for this loop 15794 1726882643.12458: getting the next task for host managed_node1 15794 1726882643.12466: done getting next task for host managed_node1 15794 1726882643.12469: ^ task is: TASK: meta (flush_handlers) 15794 1726882643.12471: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882643.12475: getting variables 15794 1726882643.12477: in VariableManager get_vars() 15794 1726882643.12509: Calling all_inventory to load vars for managed_node1 15794 1726882643.12512: Calling groups_inventory to load vars for managed_node1 15794 1726882643.12516: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882643.12526: Calling all_plugins_play to load vars for managed_node1 15794 1726882643.12529: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882643.12532: Calling groups_plugins_play to load vars for managed_node1 15794 1726882643.15152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882643.19202: done with get_vars() 15794 1726882643.19244: done getting variables 15794 1726882643.19336: in VariableManager get_vars() 15794 1726882643.19349: Calling all_inventory to load vars for managed_node1 15794 1726882643.19353: Calling groups_inventory to load vars for managed_node1 15794 1726882643.19356: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882643.19362: Calling all_plugins_play to load vars for managed_node1 15794 1726882643.19365: Calling groups_plugins_inventory to load 
vars for managed_node1 15794 1726882643.19369: Calling groups_plugins_play to load vars for managed_node1 15794 1726882643.22452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882643.25377: done with get_vars() 15794 1726882643.25423: done queuing things up, now waiting for results queue to drain 15794 1726882643.25426: results queue empty 15794 1726882643.25427: checking for any_errors_fatal 15794 1726882643.25432: done checking for any_errors_fatal 15794 1726882643.25433: checking for max_fail_percentage 15794 1726882643.25435: done checking for max_fail_percentage 15794 1726882643.25436: checking to see if all hosts have failed and the running result is not ok 15794 1726882643.25437: done checking to see if all hosts have failed 15794 1726882643.25438: getting the remaining hosts for this loop 15794 1726882643.25440: done getting the remaining hosts for this loop 15794 1726882643.25443: getting the next task for host managed_node1 15794 1726882643.25447: done getting next task for host managed_node1 15794 1726882643.25449: ^ task is: TASK: meta (flush_handlers) 15794 1726882643.25451: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882643.25454: getting variables 15794 1726882643.25455: in VariableManager get_vars() 15794 1726882643.25467: Calling all_inventory to load vars for managed_node1 15794 1726882643.25470: Calling groups_inventory to load vars for managed_node1 15794 1726882643.25473: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882643.25479: Calling all_plugins_play to load vars for managed_node1 15794 1726882643.25483: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882643.25487: Calling groups_plugins_play to load vars for managed_node1 15794 1726882643.27550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882643.30439: done with get_vars() 15794 1726882643.30472: done getting variables 15794 1726882643.30545: in VariableManager get_vars() 15794 1726882643.30558: Calling all_inventory to load vars for managed_node1 15794 1726882643.30561: Calling groups_inventory to load vars for managed_node1 15794 1726882643.30564: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882643.30570: Calling all_plugins_play to load vars for managed_node1 15794 1726882643.30573: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882643.30576: Calling groups_plugins_play to load vars for managed_node1 15794 1726882643.32549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882643.35453: done with get_vars() 15794 1726882643.35503: done queuing things up, now waiting for results queue to drain 15794 1726882643.35506: results queue empty 15794 1726882643.35508: checking for any_errors_fatal 15794 1726882643.35509: done checking for any_errors_fatal 15794 1726882643.35510: checking for max_fail_percentage 15794 1726882643.35512: done checking for max_fail_percentage 15794 1726882643.35513: checking to see if all hosts have failed and the running result is not 
ok 15794 1726882643.35514: done checking to see if all hosts have failed 15794 1726882643.35515: getting the remaining hosts for this loop 15794 1726882643.35516: done getting the remaining hosts for this loop 15794 1726882643.35519: getting the next task for host managed_node1 15794 1726882643.35523: done getting next task for host managed_node1 15794 1726882643.35525: ^ task is: None 15794 1726882643.35527: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882643.35528: done queuing things up, now waiting for results queue to drain 15794 1726882643.35529: results queue empty 15794 1726882643.35530: checking for any_errors_fatal 15794 1726882643.35531: done checking for any_errors_fatal 15794 1726882643.35532: checking for max_fail_percentage 15794 1726882643.35535: done checking for max_fail_percentage 15794 1726882643.35536: checking to see if all hosts have failed and the running result is not ok 15794 1726882643.35537: done checking to see if all hosts have failed 15794 1726882643.35539: getting the next task for host managed_node1 15794 1726882643.35546: done getting next task for host managed_node1 15794 1726882643.35547: ^ task is: None 15794 1726882643.35549: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882643.35599: in VariableManager get_vars() 15794 1726882643.35629: done with get_vars() 15794 1726882643.35640: in VariableManager get_vars() 15794 1726882643.35658: done with get_vars() 15794 1726882643.35663: variable 'omit' from source: magic vars 15794 1726882643.35814: variable 'profile' from source: play vars 15794 1726882643.35944: in VariableManager get_vars() 15794 1726882643.35962: done with get_vars() 15794 1726882643.35988: variable 'omit' from source: magic vars 15794 1726882643.36076: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 15794 1726882643.37091: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882643.37120: getting the remaining hosts for this loop 15794 1726882643.37121: done getting the remaining hosts for this loop 15794 1726882643.37124: getting the next task for host managed_node1 15794 1726882643.37127: done getting next task for host managed_node1 15794 1726882643.37130: ^ task is: TASK: Gathering Facts 15794 1726882643.37132: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882643.37135: getting variables 15794 1726882643.37137: in VariableManager get_vars() 15794 1726882643.37149: Calling all_inventory to load vars for managed_node1 15794 1726882643.37152: Calling groups_inventory to load vars for managed_node1 15794 1726882643.37155: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882643.37161: Calling all_plugins_play to load vars for managed_node1 15794 1726882643.37164: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882643.37167: Calling groups_plugins_play to load vars for managed_node1 15794 1726882643.43926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882643.49607: done with get_vars() 15794 1726882643.49652: done getting variables 15794 1726882643.49712: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:37:23 -0400 (0:00:00.793) 0:00:41.057 ****** 15794 1726882643.49945: entering _queue_task() for managed_node1/gather_facts 15794 1726882643.50503: worker is 1 (out of 1 available) 15794 1726882643.50516: exiting _queue_task() for managed_node1/gather_facts 15794 1726882643.50531: done queuing things up, now waiting for results queue to drain 15794 1726882643.50532: waiting for pending results... 
15794 1726882643.51150: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882643.51277: in run() - task 0affe814-3a2d-94e5-e48f-000000000417 15794 1726882643.51368: variable 'ansible_search_path' from source: unknown 15794 1726882643.51411: calling self._execute() 15794 1726882643.51803: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882643.51809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882643.51812: variable 'omit' from source: magic vars 15794 1726882643.52617: variable 'ansible_distribution_major_version' from source: facts 15794 1726882643.52778: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882643.52784: variable 'omit' from source: magic vars 15794 1726882643.52815: variable 'omit' from source: magic vars 15794 1726882643.52935: variable 'omit' from source: magic vars 15794 1726882643.52989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882643.53327: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882643.53332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882643.53337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882643.53340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882643.53373: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882643.53387: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882643.53396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882643.53541: Set connection var ansible_connection to ssh 15794 1726882643.53665: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882643.53868: Set connection var ansible_pipelining to False 15794 1726882643.53872: Set connection var ansible_shell_executable to /bin/sh 15794 1726882643.53874: Set connection var ansible_shell_type to sh 15794 1726882643.53876: Set connection var ansible_timeout to 10 15794 1726882643.53882: variable 'ansible_shell_executable' from source: unknown 15794 1726882643.53884: variable 'ansible_connection' from source: unknown 15794 1726882643.53887: variable 'ansible_module_compression' from source: unknown 15794 1726882643.53890: variable 'ansible_shell_type' from source: unknown 15794 1726882643.53892: variable 'ansible_shell_executable' from source: unknown 15794 1726882643.53894: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882643.53896: variable 'ansible_pipelining' from source: unknown 15794 1726882643.53898: variable 'ansible_timeout' from source: unknown 15794 1726882643.53901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882643.54506: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882643.54540: variable 'omit' from source: magic vars 15794 1726882643.54543: starting attempt loop 15794 1726882643.54741: running the handler 15794 1726882643.54745: variable 'ansible_facts' from source: unknown 15794 1726882643.54748: _low_level_execute_command(): starting 15794 1726882643.54751: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882643.56243: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882643.56321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882643.56363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882643.56455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882643.56495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882643.58341: stdout chunk (state=3): >>>/root <<< 15794 1726882643.58468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882643.58616: stderr chunk (state=3): >>><<< 15794 1726882643.58620: stdout chunk (state=3): >>><<< 15794 1726882643.58644: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882643.58765: _low_level_execute_command(): starting 15794 1726882643.58778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713 `" && echo ansible-tmp-1726882643.5867302-17344-106541967382713="` echo /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713 `" ) && sleep 0' 15794 1726882643.59490: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882643.59507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882643.59521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882643.59565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882643.59678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882643.59704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882643.59723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882643.59893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882643.61906: stdout chunk (state=3): >>>ansible-tmp-1726882643.5867302-17344-106541967382713=/root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713 <<< 15794 1726882643.62332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882643.62338: stdout chunk (state=3): >>><<< 15794 1726882643.62341: stderr chunk (state=3): >>><<< 15794 1726882643.62344: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882643.5867302-17344-106541967382713=/root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882643.62347: variable 'ansible_module_compression' from source: unknown 15794 1726882643.62558: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882643.62716: variable 'ansible_facts' from source: unknown 15794 1726882643.63118: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/AnsiballZ_setup.py 15794 1726882643.63658: Sending initial data 15794 1726882643.63661: Sent initial data (154 bytes) 15794 1726882643.64225: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882643.64332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 
15794 1726882643.64356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882643.64374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882643.64473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882643.66127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882643.66185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882643.66238: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp09g02qrk /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/AnsiballZ_setup.py <<< 15794 1726882643.66259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/AnsiballZ_setup.py" <<< 15794 1726882643.66347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp09g02qrk" to remote "/root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/AnsiballZ_setup.py" <<< 15794 1726882643.69482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882643.69520: stderr chunk (state=3): >>><<< 15794 1726882643.69535: stdout chunk (state=3): >>><<< 15794 1726882643.69591: done transferring module to remote 15794 1726882643.69609: _low_level_execute_command(): starting 15794 1726882643.69649: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/ /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/AnsiballZ_setup.py && sleep 0' 15794 1726882643.70871: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882643.70931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882643.71114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882643.71248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882643.73189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882643.73216: stdout chunk (state=3): >>><<< 15794 1726882643.73229: stderr chunk (state=3): >>><<< 15794 1726882643.73254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882643.73269: _low_level_execute_command(): starting 15794 1726882643.73284: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/AnsiballZ_setup.py && sleep 0' 15794 1726882643.73942: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882643.73959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882643.73976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882643.74008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882643.74348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882643.74373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882643.74416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882643.74492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882644.42871: stdout chunk 
(state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.50927734375, "5m": 0.44189453125, "15m": 0.2216796875}, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": 
"ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], 
"ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2866, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 851, "free": 2866}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": 
[], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 598, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205214208, "block_size": 4096, "block_total": 64483404, "block_available": 61329398, "block_used": 3154006, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_fips": false, "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", 
"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "24", "epoch": "1726882644", "epoch_int": "1726882644", "date": "2024-09-20", "time": "21:37:24", "iso8601_micro": "2024-09-21T01:37:24.395349Z", "iso8601": "2024-09-21T01:37:24Z", "iso8601_basic": "20240920T213724395349", "iso8601_basic_short": "20240920T213724", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882644.44662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882644.44682: stderr chunk (state=3): >>>Shared connection to 10.31.10.217 closed. <<< 15794 1726882644.45000: stderr chunk (state=3): >>><<< 15794 1726882644.45004: stdout chunk (state=3): >>><<< 15794 1726882644.45087: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.50927734375, "5m": 0.44189453125, "15m": 0.2216796875}, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", 
"ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2866, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 851, "free": 2866}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 
1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 598, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205214208, "block_size": 4096, "block_total": 64483404, "block_available": 61329398, "block_used": 3154006, "inode_total": 16384000, "inode_available": 16303773, "inode_used": 80227, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_fips": false, "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", 
"HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "24", "epoch": "1726882644", "epoch_int": "1726882644", "date": "2024-09-20", "time": "21:37:24", "iso8601_micro": "2024-09-21T01:37:24.395349Z", "iso8601": "2024-09-21T01:37:24Z", "iso8601_basic": "20240920T213724395349", "iso8601_basic_short": "20240920T213724", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": 
"vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882644.45831: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882644.45958: _low_level_execute_command(): starting 15794 1726882644.45972: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882643.5867302-17344-106541967382713/ > /dev/null 2>&1 && sleep 0' 15794 1726882644.47472: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882644.47804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882644.47903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882644.49977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882644.49981: stdout chunk (state=3): >>><<< 15794 1726882644.49984: stderr chunk (state=3): >>><<< 15794 1726882644.50067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882644.50084: handler run complete 15794 1726882644.50477: variable 'ansible_facts' from source: unknown 15794 1726882644.50846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882644.51742: variable 'ansible_facts' from source: unknown 15794 1726882644.51923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882644.52696: attempt loop complete, returning result 15794 1726882644.52708: _execute() done 15794 1726882644.52716: dumping result to json 15794 1726882644.52754: done dumping result, returning 15794 1726882644.52770: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-000000000417] 15794 1726882644.52791: sending task result for task 0affe814-3a2d-94e5-e48f-000000000417 15794 1726882644.53368: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000417 15794 1726882644.53371: WORKER PROCESS EXITING ok: [managed_node1] 15794 1726882644.54316: no more pending results, returning what we have 15794 1726882644.54320: results queue empty 15794 1726882644.54321: checking for any_errors_fatal 15794 1726882644.54322: done checking for any_errors_fatal 15794 1726882644.54323: checking for max_fail_percentage 15794 1726882644.54325: done checking for max_fail_percentage 15794 1726882644.54326: checking to see if all hosts have failed and the running result is not ok 15794 1726882644.54327: done checking to see if all hosts have failed 15794 1726882644.54328: getting the remaining hosts for this loop 15794 1726882644.54329: done getting the remaining hosts for this loop 15794 1726882644.54397: getting the next task for host managed_node1 15794 1726882644.54404: done 
getting next task for host managed_node1 15794 1726882644.54407: ^ task is: TASK: meta (flush_handlers) 15794 1726882644.54410: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882644.54414: getting variables 15794 1726882644.54416: in VariableManager get_vars() 15794 1726882644.54450: Calling all_inventory to load vars for managed_node1 15794 1726882644.54453: Calling groups_inventory to load vars for managed_node1 15794 1726882644.54456: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882644.54468: Calling all_plugins_play to load vars for managed_node1 15794 1726882644.54471: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882644.54475: Calling groups_plugins_play to load vars for managed_node1 15794 1726882644.59012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882644.65145: done with get_vars() 15794 1726882644.65190: done getting variables 15794 1726882644.65487: in VariableManager get_vars() 15794 1726882644.65506: Calling all_inventory to load vars for managed_node1 15794 1726882644.65509: Calling groups_inventory to load vars for managed_node1 15794 1726882644.65512: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882644.65518: Calling all_plugins_play to load vars for managed_node1 15794 1726882644.65521: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882644.65525: Calling groups_plugins_play to load vars for managed_node1 15794 1726882644.70194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882644.76293: done with get_vars() 15794 1726882644.76346: done queuing 
things up, now waiting for results queue to drain 15794 1726882644.76349: results queue empty 15794 1726882644.76350: checking for any_errors_fatal 15794 1726882644.76357: done checking for any_errors_fatal 15794 1726882644.76358: checking for max_fail_percentage 15794 1726882644.76360: done checking for max_fail_percentage 15794 1726882644.76361: checking to see if all hosts have failed and the running result is not ok 15794 1726882644.76362: done checking to see if all hosts have failed 15794 1726882644.76363: getting the remaining hosts for this loop 15794 1726882644.76364: done getting the remaining hosts for this loop 15794 1726882644.76372: getting the next task for host managed_node1 15794 1726882644.76377: done getting next task for host managed_node1 15794 1726882644.76385: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15794 1726882644.76387: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882644.76400: getting variables 15794 1726882644.76401: in VariableManager get_vars() 15794 1726882644.76423: Calling all_inventory to load vars for managed_node1 15794 1726882644.76426: Calling groups_inventory to load vars for managed_node1 15794 1726882644.76428: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882644.76531: Calling all_plugins_play to load vars for managed_node1 15794 1726882644.76538: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882644.76543: Calling groups_plugins_play to load vars for managed_node1 15794 1726882644.78808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882644.83688: done with get_vars() 15794 1726882644.83724: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:24 -0400 (0:00:01.338) 0:00:42.396 ****** 15794 1726882644.83815: entering _queue_task() for managed_node1/include_tasks 15794 1726882644.84588: worker is 1 (out of 1 available) 15794 1726882644.84604: exiting _queue_task() for managed_node1/include_tasks 15794 1726882644.84620: done queuing things up, now waiting for results queue to drain 15794 1726882644.84621: waiting for pending results... 
15794 1726882644.85355: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15794 1726882644.85857: in run() - task 0affe814-3a2d-94e5-e48f-00000000005c 15794 1726882644.85861: variable 'ansible_search_path' from source: unknown 15794 1726882644.85863: variable 'ansible_search_path' from source: unknown 15794 1726882644.85866: calling self._execute() 15794 1726882644.85869: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882644.85872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882644.85875: variable 'omit' from source: magic vars 15794 1726882644.86783: variable 'ansible_distribution_major_version' from source: facts 15794 1726882644.86805: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882644.86831: _execute() done 15794 1726882644.86894: dumping result to json 15794 1726882644.86905: done dumping result, returning 15794 1726882644.86917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-94e5-e48f-00000000005c] 15794 1726882644.86939: sending task result for task 0affe814-3a2d-94e5-e48f-00000000005c 15794 1726882644.87096: no more pending results, returning what we have 15794 1726882644.87104: in VariableManager get_vars() 15794 1726882644.87158: Calling all_inventory to load vars for managed_node1 15794 1726882644.87162: Calling groups_inventory to load vars for managed_node1 15794 1726882644.87165: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882644.87182: Calling all_plugins_play to load vars for managed_node1 15794 1726882644.87186: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882644.87189: Calling groups_plugins_play to load vars for managed_node1 15794 1726882644.87851: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000005c 15794 
1726882644.87856: WORKER PROCESS EXITING 15794 1726882644.91866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882644.97607: done with get_vars() 15794 1726882644.97725: variable 'ansible_search_path' from source: unknown 15794 1726882644.97726: variable 'ansible_search_path' from source: unknown 15794 1726882644.97763: we have included files to process 15794 1726882644.97765: generating all_blocks data 15794 1726882644.97766: done generating all_blocks data 15794 1726882644.97767: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882644.97768: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882644.97771: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15794 1726882644.98649: done processing included file 15794 1726882644.98651: iterating over new_blocks loaded from include file 15794 1726882644.98654: in VariableManager get_vars() 15794 1726882644.98687: done with get_vars() 15794 1726882644.98694: filtering new block on tags 15794 1726882644.98714: done filtering new block on tags 15794 1726882644.98717: in VariableManager get_vars() 15794 1726882644.98746: done with get_vars() 15794 1726882644.98749: filtering new block on tags 15794 1726882644.98774: done filtering new block on tags 15794 1726882644.98777: in VariableManager get_vars() 15794 1726882644.98813: done with get_vars() 15794 1726882644.98816: filtering new block on tags 15794 1726882644.98840: done filtering new block on tags 15794 1726882644.98842: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15794 1726882644.98849: extending task lists for all hosts 
with included blocks 15794 1726882644.99423: done extending task lists 15794 1726882644.99424: done processing included files 15794 1726882644.99425: results queue empty 15794 1726882644.99426: checking for any_errors_fatal 15794 1726882644.99428: done checking for any_errors_fatal 15794 1726882644.99429: checking for max_fail_percentage 15794 1726882644.99431: done checking for max_fail_percentage 15794 1726882644.99432: checking to see if all hosts have failed and the running result is not ok 15794 1726882644.99433: done checking to see if all hosts have failed 15794 1726882644.99435: getting the remaining hosts for this loop 15794 1726882644.99437: done getting the remaining hosts for this loop 15794 1726882644.99440: getting the next task for host managed_node1 15794 1726882644.99445: done getting next task for host managed_node1 15794 1726882644.99448: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15794 1726882644.99451: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882644.99467: getting variables 15794 1726882644.99469: in VariableManager get_vars() 15794 1726882644.99504: Calling all_inventory to load vars for managed_node1 15794 1726882644.99507: Calling groups_inventory to load vars for managed_node1 15794 1726882644.99510: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882644.99517: Calling all_plugins_play to load vars for managed_node1 15794 1726882644.99521: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882644.99525: Calling groups_plugins_play to load vars for managed_node1 15794 1726882645.01694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882645.05701: done with get_vars() 15794 1726882645.05742: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:25 -0400 (0:00:00.220) 0:00:42.616 ****** 15794 1726882645.05844: entering _queue_task() for managed_node1/setup 15794 1726882645.06337: worker is 1 (out of 1 available) 15794 1726882645.06350: exiting _queue_task() for managed_node1/setup 15794 1726882645.06438: done queuing things up, now waiting for results queue to drain 15794 1726882645.06443: waiting for pending results... 
15794 1726882645.06698: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15794 1726882645.06968: in run() - task 0affe814-3a2d-94e5-e48f-000000000458 15794 1726882645.06995: variable 'ansible_search_path' from source: unknown 15794 1726882645.07010: variable 'ansible_search_path' from source: unknown 15794 1726882645.07066: calling self._execute() 15794 1726882645.07190: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882645.07228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882645.07232: variable 'omit' from source: magic vars 15794 1726882645.07705: variable 'ansible_distribution_major_version' from source: facts 15794 1726882645.07725: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882645.08103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882645.10916: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882645.11017: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882645.11086: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882645.11180: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882645.11184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882645.11325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882645.11354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882645.11396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882645.11466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882645.11492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882645.11587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882645.12044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882645.12047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882645.12050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882645.12052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882645.12281: variable '__network_required_facts' from source: role 
'' defaults 15794 1726882645.12388: variable 'ansible_facts' from source: unknown 15794 1726882645.13933: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15794 1726882645.13946: when evaluation is False, skipping this task 15794 1726882645.13955: _execute() done 15794 1726882645.13962: dumping result to json 15794 1726882645.13970: done dumping result, returning 15794 1726882645.13987: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-94e5-e48f-000000000458] 15794 1726882645.14007: sending task result for task 0affe814-3a2d-94e5-e48f-000000000458 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882645.14185: no more pending results, returning what we have 15794 1726882645.14191: results queue empty 15794 1726882645.14192: checking for any_errors_fatal 15794 1726882645.14194: done checking for any_errors_fatal 15794 1726882645.14195: checking for max_fail_percentage 15794 1726882645.14197: done checking for max_fail_percentage 15794 1726882645.14199: checking to see if all hosts have failed and the running result is not ok 15794 1726882645.14200: done checking to see if all hosts have failed 15794 1726882645.14201: getting the remaining hosts for this loop 15794 1726882645.14203: done getting the remaining hosts for this loop 15794 1726882645.14208: getting the next task for host managed_node1 15794 1726882645.14225: done getting next task for host managed_node1 15794 1726882645.14230: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15794 1726882645.14236: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882645.14253: getting variables 15794 1726882645.14255: in VariableManager get_vars() 15794 1726882645.14306: Calling all_inventory to load vars for managed_node1 15794 1726882645.14309: Calling groups_inventory to load vars for managed_node1 15794 1726882645.14312: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882645.14441: Calling all_plugins_play to load vars for managed_node1 15794 1726882645.14446: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882645.14451: Calling groups_plugins_play to load vars for managed_node1 15794 1726882645.15099: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000458 15794 1726882645.15103: WORKER PROCESS EXITING 15794 1726882645.17492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882645.21804: done with get_vars() 15794 1726882645.21852: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:25 -0400 (0:00:00.163) 0:00:42.779 ****** 15794 1726882645.22183: entering _queue_task() for managed_node1/stat 15794 1726882645.22994: worker is 1 (out of 1 available) 15794 1726882645.23008: exiting _queue_task() for managed_node1/stat 15794 1726882645.23101: done queuing things up, now waiting for results queue to drain 15794 1726882645.23103: waiting for pending results... 
15794 1726882645.23489: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15794 1726882645.23741: in run() - task 0affe814-3a2d-94e5-e48f-00000000045a 15794 1726882645.23745: variable 'ansible_search_path' from source: unknown 15794 1726882645.23749: variable 'ansible_search_path' from source: unknown 15794 1726882645.23752: calling self._execute() 15794 1726882645.23859: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882645.23874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882645.23892: variable 'omit' from source: magic vars 15794 1726882645.24342: variable 'ansible_distribution_major_version' from source: facts 15794 1726882645.24361: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882645.24574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882645.24903: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882645.24963: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882645.25010: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882645.25139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882645.25159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882645.25197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882645.25240: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882645.25280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882645.25394: variable '__network_is_ostree' from source: set_fact 15794 1726882645.25408: Evaluated conditional (not __network_is_ostree is defined): False 15794 1726882645.25418: when evaluation is False, skipping this task 15794 1726882645.25427: _execute() done 15794 1726882645.25438: dumping result to json 15794 1726882645.25448: done dumping result, returning 15794 1726882645.25461: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-94e5-e48f-00000000045a] 15794 1726882645.25473: sending task result for task 0affe814-3a2d-94e5-e48f-00000000045a skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15794 1726882645.25743: no more pending results, returning what we have 15794 1726882645.25749: results queue empty 15794 1726882645.25751: checking for any_errors_fatal 15794 1726882645.25760: done checking for any_errors_fatal 15794 1726882645.25761: checking for max_fail_percentage 15794 1726882645.25763: done checking for max_fail_percentage 15794 1726882645.25764: checking to see if all hosts have failed and the running result is not ok 15794 1726882645.25765: done checking to see if all hosts have failed 15794 1726882645.25766: getting the remaining hosts for this loop 15794 1726882645.25770: done getting the remaining hosts for this loop 15794 1726882645.25774: getting the next task for host managed_node1 15794 1726882645.25782: done getting next task for host managed_node1 15794 
1726882645.25787: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15794 1726882645.25791: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882645.25808: getting variables 15794 1726882645.25810: in VariableManager get_vars() 15794 1726882645.25966: Calling all_inventory to load vars for managed_node1 15794 1726882645.25970: Calling groups_inventory to load vars for managed_node1 15794 1726882645.25974: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882645.25981: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000045a 15794 1726882645.25986: WORKER PROCESS EXITING 15794 1726882645.25997: Calling all_plugins_play to load vars for managed_node1 15794 1726882645.26002: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882645.26006: Calling groups_plugins_play to load vars for managed_node1 15794 1726882645.28465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882645.31264: done with get_vars() 15794 1726882645.31312: done getting variables 15794 1726882645.31387: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
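Annotation: both ostree tasks in this section skip with `"false_condition": "not __network_is_ostree is defined"`, because `__network_is_ostree` was already set by an earlier `set_fact` (the log shows it coming `from source: set_fact`). The guard reduced to plain Python (logic illustrative, variable name taken from the log):

```python
# when: not __network_is_ostree is defined
# Once the fact exists (whatever its value), the stat and set_fact
# tasks are skipped on subsequent includes of set_facts.yml.

facts = {"__network_is_ostree": False}  # cached by an earlier set_fact

def should_run_ostree_check(facts: dict) -> bool:
    """True only when the fact has never been set."""
    return "__network_is_ostree" not in facts

first_run = should_run_ostree_check({})        # True: check must run once
cached_run = should_run_ostree_check(facts)    # False: skipped, as in the log
```

This is a common run-once caching pattern in roles: an expensive check executes a single time per host and its result is reused on every later invocation.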
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:25 -0400 (0:00:00.092) 0:00:42.872 ****** 15794 1726882645.31430: entering _queue_task() for managed_node1/set_fact 15794 1726882645.31817: worker is 1 (out of 1 available) 15794 1726882645.31831: exiting _queue_task() for managed_node1/set_fact 15794 1726882645.31950: done queuing things up, now waiting for results queue to drain 15794 1726882645.31952: waiting for pending results... 15794 1726882645.32167: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15794 1726882645.32333: in run() - task 0affe814-3a2d-94e5-e48f-00000000045b 15794 1726882645.32360: variable 'ansible_search_path' from source: unknown 15794 1726882645.32370: variable 'ansible_search_path' from source: unknown 15794 1726882645.32419: calling self._execute() 15794 1726882645.32541: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882645.32558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882645.32577: variable 'omit' from source: magic vars 15794 1726882645.33027: variable 'ansible_distribution_major_version' from source: facts 15794 1726882645.33047: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882645.33440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882645.33574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882645.33630: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882645.33677: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 
1726882645.33718: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882645.33823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882645.33862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882645.33905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882645.33946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882645.34061: variable '__network_is_ostree' from source: set_fact 15794 1726882645.34075: Evaluated conditional (not __network_is_ostree is defined): False 15794 1726882645.34085: when evaluation is False, skipping this task 15794 1726882645.34098: _execute() done 15794 1726882645.34106: dumping result to json 15794 1726882645.34115: done dumping result, returning 15794 1726882645.34127: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-94e5-e48f-00000000045b] 15794 1726882645.34142: sending task result for task 0affe814-3a2d-94e5-e48f-00000000045b skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15794 1726882645.34392: no more pending results, returning what we have 15794 1726882645.34397: results queue empty 15794 1726882645.34399: checking for any_errors_fatal 15794 1726882645.34407: done checking 
for any_errors_fatal 15794 1726882645.34408: checking for max_fail_percentage 15794 1726882645.34410: done checking for max_fail_percentage 15794 1726882645.34412: checking to see if all hosts have failed and the running result is not ok 15794 1726882645.34413: done checking to see if all hosts have failed 15794 1726882645.34413: getting the remaining hosts for this loop 15794 1726882645.34416: done getting the remaining hosts for this loop 15794 1726882645.34421: getting the next task for host managed_node1 15794 1726882645.34436: done getting next task for host managed_node1 15794 1726882645.34440: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15794 1726882645.34444: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882645.34461: getting variables 15794 1726882645.34463: in VariableManager get_vars() 15794 1726882645.34509: Calling all_inventory to load vars for managed_node1 15794 1726882645.34513: Calling groups_inventory to load vars for managed_node1 15794 1726882645.34517: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882645.34529: Calling all_plugins_play to load vars for managed_node1 15794 1726882645.34736: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882645.34744: Calling groups_plugins_play to load vars for managed_node1 15794 1726882645.35451: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000045b 15794 1726882645.35454: WORKER PROCESS EXITING 15794 1726882645.37006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882645.40056: done with get_vars() 15794 1726882645.40096: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:25 -0400 (0:00:00.087) 0:00:42.960 ****** 15794 1726882645.40214: entering _queue_task() for managed_node1/service_facts 15794 1726882645.40677: worker is 1 (out of 1 available) 15794 1726882645.40691: exiting _queue_task() for managed_node1/service_facts 15794 1726882645.40703: done queuing things up, now waiting for results queue to drain 15794 1726882645.40705: waiting for pending results... 
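Annotation: the `service_facts` task queued here returns `ansible_facts.services`, a dict keyed by unit name (its JSON output appears further below). A small sketch of that structure, using two entries excerpted from this log, and how a playbook-side filter over it behaves:

```python
# Excerpt of the ansible_facts.services shape returned by service_facts,
# with values copied from the module output later in this log.

services = {
    "NetworkManager.service": {
        "name": "NetworkManager.service",
        "state": "running", "status": "enabled", "source": "systemd",
    },
    "network.service": {
        "name": "network.service",
        "state": "stopped", "status": "not-found", "source": "systemd",
    },
}

# e.g. selecting units that are actually running:
running = [name for name, svc in services.items() if svc["state"] == "running"]
```

The role typically consults this map to decide which network backend (NetworkManager vs. the legacy network service) is in use on the managed host.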
15794 1726882645.40963: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15794 1726882645.41136: in run() - task 0affe814-3a2d-94e5-e48f-00000000045d 15794 1726882645.41159: variable 'ansible_search_path' from source: unknown 15794 1726882645.41168: variable 'ansible_search_path' from source: unknown 15794 1726882645.41221: calling self._execute() 15794 1726882645.41344: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882645.41359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882645.41377: variable 'omit' from source: magic vars 15794 1726882645.41840: variable 'ansible_distribution_major_version' from source: facts 15794 1726882645.41859: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882645.41872: variable 'omit' from source: magic vars 15794 1726882645.41948: variable 'omit' from source: magic vars 15794 1726882645.42004: variable 'omit' from source: magic vars 15794 1726882645.42054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882645.42113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882645.42287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882645.42540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882645.42544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882645.42547: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882645.42549: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882645.42552: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15794 1726882645.42732: Set connection var ansible_connection to ssh 15794 1726882645.42769: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882645.42784: Set connection var ansible_pipelining to False 15794 1726882645.42799: Set connection var ansible_shell_executable to /bin/sh 15794 1726882645.42807: Set connection var ansible_shell_type to sh 15794 1726882645.42821: Set connection var ansible_timeout to 10 15794 1726882645.42861: variable 'ansible_shell_executable' from source: unknown 15794 1726882645.42869: variable 'ansible_connection' from source: unknown 15794 1726882645.42878: variable 'ansible_module_compression' from source: unknown 15794 1726882645.42888: variable 'ansible_shell_type' from source: unknown 15794 1726882645.42902: variable 'ansible_shell_executable' from source: unknown 15794 1726882645.42911: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882645.42921: variable 'ansible_pipelining' from source: unknown 15794 1726882645.42929: variable 'ansible_timeout' from source: unknown 15794 1726882645.42943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882645.43179: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882645.43227: variable 'omit' from source: magic vars 15794 1726882645.43231: starting attempt loop 15794 1726882645.43235: running the handler 15794 1726882645.43244: _low_level_execute_command(): starting 15794 1726882645.43259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882645.44070: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882645.44142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882645.44193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882645.44300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882645.46027: stdout chunk (state=3): >>>/root <<< 15794 1726882645.46150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882645.46323: stderr chunk (state=3): >>><<< 15794 1726882645.46327: stdout chunk (state=3): >>><<< 15794 1726882645.46330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882645.46332: _low_level_execute_command(): starting 15794 1726882645.46336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556 `" && echo ansible-tmp-1726882645.462575-17421-197625316078556="` echo /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556 `" ) && sleep 0' 15794 1726882645.47347: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882645.47352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882645.47482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882645.47667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882645.49640: stdout chunk (state=3): >>>ansible-tmp-1726882645.462575-17421-197625316078556=/root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556 <<< 15794 1726882645.50143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882645.50147: stdout chunk (state=3): >>><<< 15794 1726882645.50149: stderr chunk (state=3): >>><<< 15794 1726882645.50152: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882645.462575-17421-197625316078556=/root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882645.50155: variable 'ansible_module_compression' from source: unknown 15794 1726882645.50157: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15794 1726882645.50160: variable 'ansible_facts' from source: unknown 15794 1726882645.50380: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/AnsiballZ_service_facts.py 15794 1726882645.50769: Sending initial data 15794 1726882645.50779: Sent initial data (161 bytes) 15794 1726882645.51859: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882645.51874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882645.51887: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882645.52069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882645.52139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882645.52158: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15794 1726882645.52215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882645.53822: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882645.53910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882645.53978: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpw8w97a8h /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/AnsiballZ_service_facts.py <<< 15794 1726882645.53982: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/AnsiballZ_service_facts.py" <<< 15794 1726882645.54027: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpw8w97a8h" to remote "/root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/AnsiballZ_service_facts.py" <<< 15794 1726882645.56120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882645.56292: stderr chunk (state=3): >>><<< 15794 1726882645.56303: stdout chunk (state=3): >>><<< 15794 1726882645.56332: done transferring module to remote 15794 1726882645.56525: _low_level_execute_command(): starting 15794 1726882645.56530: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/ /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/AnsiballZ_service_facts.py && sleep 0' 15794 1726882645.57666: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882645.57718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882645.57730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882645.57749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882645.57764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 
originally 10.31.10.217 <<< 15794 1726882645.57772: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882645.57786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882645.57856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882645.58050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882645.58057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882645.58144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882645.60079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882645.60367: stderr chunk (state=3): >>><<< 15794 1726882645.60371: stdout chunk (state=3): >>><<< 15794 1726882645.60540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882645.60543: _low_level_execute_command(): starting 15794 1726882645.60547: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/AnsiballZ_service_facts.py && sleep 0' 15794 1726882645.61485: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882645.61501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882645.61513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882645.61786: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 15794 1726882645.61847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882645.61870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882645.61947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882647.53794: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 15794 1726882647.53824: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "sta<<< 15794 1726882647.53842: stdout chunk (state=3): >>>te": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": 
"systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": 
"dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": <<< 15794 1726882647.53864: stdout chunk (state=3): >>>"inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"},<<< 15794 1726882647.53886: stdout chunk (state=3): >>> "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "<<< 15794 1726882647.53895: stdout chunk (state=3): >>>systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15794 1726882647.55450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882647.55512: stderr chunk (state=3): >>><<< 15794 1726882647.55515: stdout chunk (state=3): >>><<< 15794 1726882647.55550: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", 
"status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": 
"mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882647.56229: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882647.56242: _low_level_execute_command(): starting 15794 1726882647.56248: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882645.462575-17421-197625316078556/ > /dev/null 2>&1 && sleep 0' 15794 1726882647.56696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882647.56706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882647.56731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.56737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 
originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.56793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882647.56801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882647.56861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882647.58749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882647.58798: stderr chunk (state=3): >>><<< 15794 1726882647.58801: stdout chunk (state=3): >>><<< 15794 1726882647.58815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882647.58822: handler run complete 15794 1726882647.58987: variable 'ansible_facts' 
from source: unknown 15794 1726882647.59124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882647.59562: variable 'ansible_facts' from source: unknown 15794 1726882647.59690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882647.59888: attempt loop complete, returning result 15794 1726882647.59895: _execute() done 15794 1726882647.59898: dumping result to json 15794 1726882647.59948: done dumping result, returning 15794 1726882647.59957: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-94e5-e48f-00000000045d] 15794 1726882647.59962: sending task result for task 0affe814-3a2d-94e5-e48f-00000000045d ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882647.60770: no more pending results, returning what we have 15794 1726882647.60773: results queue empty 15794 1726882647.60774: checking for any_errors_fatal 15794 1726882647.60778: done checking for any_errors_fatal 15794 1726882647.60779: checking for max_fail_percentage 15794 1726882647.60781: done checking for max_fail_percentage 15794 1726882647.60782: checking to see if all hosts have failed and the running result is not ok 15794 1726882647.60783: done checking to see if all hosts have failed 15794 1726882647.60783: getting the remaining hosts for this loop 15794 1726882647.60785: done getting the remaining hosts for this loop 15794 1726882647.60788: getting the next task for host managed_node1 15794 1726882647.60794: done getting next task for host managed_node1 15794 1726882647.60797: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15794 1726882647.60800: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882647.60810: getting variables 15794 1726882647.60811: in VariableManager get_vars() 15794 1726882647.60844: Calling all_inventory to load vars for managed_node1 15794 1726882647.60846: Calling groups_inventory to load vars for managed_node1 15794 1726882647.60848: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882647.60857: Calling all_plugins_play to load vars for managed_node1 15794 1726882647.60861: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882647.60864: Calling groups_plugins_play to load vars for managed_node1 15794 1726882647.61383: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000045d 15794 1726882647.61387: WORKER PROCESS EXITING 15794 1726882647.62048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882647.64172: done with get_vars() 15794 1726882647.64213: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:27 -0400 (0:00:02.240) 0:00:45.200 ****** 15794 1726882647.64299: entering _queue_task() for managed_node1/package_facts 15794 1726882647.64548: worker is 1 (out of 1 available) 15794 1726882647.64562: exiting _queue_task() for managed_node1/package_facts 15794 1726882647.64576: done queuing things up, now waiting for 
results queue to drain 15794 1726882647.64578: waiting for pending results... 15794 1726882647.64766: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15794 1726882647.64868: in run() - task 0affe814-3a2d-94e5-e48f-00000000045e 15794 1726882647.64883: variable 'ansible_search_path' from source: unknown 15794 1726882647.64888: variable 'ansible_search_path' from source: unknown 15794 1726882647.64922: calling self._execute() 15794 1726882647.65001: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882647.65008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882647.65020: variable 'omit' from source: magic vars 15794 1726882647.65539: variable 'ansible_distribution_major_version' from source: facts 15794 1726882647.65544: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882647.65547: variable 'omit' from source: magic vars 15794 1726882647.65549: variable 'omit' from source: magic vars 15794 1726882647.65551: variable 'omit' from source: magic vars 15794 1726882647.65554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882647.65601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882647.65619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882647.65637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882647.65652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882647.65683: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882647.65686: variable 'ansible_host' from source: host vars for 
'managed_node1' 15794 1726882647.65689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882647.65773: Set connection var ansible_connection to ssh 15794 1726882647.65783: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882647.65789: Set connection var ansible_pipelining to False 15794 1726882647.65796: Set connection var ansible_shell_executable to /bin/sh 15794 1726882647.65799: Set connection var ansible_shell_type to sh 15794 1726882647.65808: Set connection var ansible_timeout to 10 15794 1726882647.65832: variable 'ansible_shell_executable' from source: unknown 15794 1726882647.65837: variable 'ansible_connection' from source: unknown 15794 1726882647.65841: variable 'ansible_module_compression' from source: unknown 15794 1726882647.65843: variable 'ansible_shell_type' from source: unknown 15794 1726882647.65848: variable 'ansible_shell_executable' from source: unknown 15794 1726882647.65851: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882647.65856: variable 'ansible_pipelining' from source: unknown 15794 1726882647.65859: variable 'ansible_timeout' from source: unknown 15794 1726882647.65869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882647.66038: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882647.66048: variable 'omit' from source: magic vars 15794 1726882647.66054: starting attempt loop 15794 1726882647.66057: running the handler 15794 1726882647.66069: _low_level_execute_command(): starting 15794 1726882647.66077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882647.66792: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.66796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882647.66820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882647.66894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882647.68602: stdout chunk (state=3): >>>/root <<< 15794 1726882647.68717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882647.68766: stderr chunk (state=3): >>><<< 15794 1726882647.68770: stdout chunk (state=3): >>><<< 15794 1726882647.68789: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882647.68804: _low_level_execute_command(): starting 15794 1726882647.68813: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201 `" && echo ansible-tmp-1726882647.6879106-17518-79664596833201="` echo /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201 `" ) && sleep 0' 15794 1726882647.69275: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882647.69279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.69282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882647.69295: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.69346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882647.69353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882647.69416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882647.71392: stdout chunk (state=3): >>>ansible-tmp-1726882647.6879106-17518-79664596833201=/root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201 <<< 15794 1726882647.71517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882647.71566: stderr chunk (state=3): >>><<< 15794 1726882647.71570: stdout chunk (state=3): >>><<< 15794 1726882647.71585: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882647.6879106-17518-79664596833201=/root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882647.71627: variable 'ansible_module_compression' from source: unknown 15794 1726882647.71672: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15794 1726882647.71731: variable 'ansible_facts' from source: unknown 15794 1726882647.71839: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/AnsiballZ_package_facts.py 15794 1726882647.71960: Sending initial data 15794 1726882647.71964: Sent initial data (161 bytes) 15794 1726882647.72434: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882647.72437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882647.72440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882647.72444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 
1726882647.72446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.72498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882647.72503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882647.72562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882647.74146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882647.74204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882647.74261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpqie9cemw /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/AnsiballZ_package_facts.py <<< 15794 1726882647.74266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/AnsiballZ_package_facts.py" <<< 15794 1726882647.74314: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpqie9cemw" to remote "/root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/AnsiballZ_package_facts.py" <<< 15794 1726882647.76074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882647.76130: stderr chunk (state=3): >>><<< 15794 1726882647.76136: stdout chunk (state=3): >>><<< 15794 1726882647.76159: done transferring module to remote 15794 1726882647.76169: _low_level_execute_command(): starting 15794 1726882647.76175: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/ /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/AnsiballZ_package_facts.py && sleep 0' 15794 1726882647.76612: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882647.76615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.76618: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882647.76621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.76669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882647.76672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882647.76732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882647.78538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882647.78582: stderr chunk (state=3): >>><<< 15794 1726882647.78586: stdout chunk (state=3): >>><<< 15794 1726882647.78596: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882647.78599: _low_level_execute_command(): starting 15794 1726882647.78605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/AnsiballZ_package_facts.py && sleep 0' 15794 1726882647.79038: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882647.79042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882647.79044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882647.79047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882647.79049: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882647.79099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882647.79103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882647.79169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882648.42673: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": 
"9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": 
"alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": 
"e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": 
"11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": 
"dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": 
"rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": 
[{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": 
"13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": 
"boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "n<<< 15794 1726882648.42704: 
stdout chunk (state=3): >>>oarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15794 1726882648.44457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882648.44658: stderr chunk (state=3): >>><<< 15794 1726882648.44661: stdout chunk (state=3): >>><<< 15794 1726882648.44694: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": 
"zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": 
"1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": 
"xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882648.53342: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882648.53346: _low_level_execute_command(): starting 15794 1726882648.53349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882647.6879106-17518-79664596833201/ > /dev/null 2>&1 && sleep 0' 15794 1726882648.54966: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882648.54984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882648.54999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882648.55020: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882648.55048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882648.55064: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882648.55156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882648.55191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882648.55230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882648.55307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882648.55341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882648.57569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882648.57786: stderr chunk (state=3): >>><<< 15794 1726882648.57789: stdout chunk (state=3): >>><<< 15794 1726882648.57793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882648.57795: handler run complete 15794 1726882648.60776: variable 'ansible_facts' from source: unknown 15794 1726882648.62406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882648.69988: variable 'ansible_facts' from source: unknown 15794 1726882648.72173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882648.75105: attempt loop complete, returning result 15794 1726882648.75136: _execute() done 15794 1726882648.75146: dumping result to json 15794 1726882648.75809: done dumping result, returning 15794 1726882648.75852: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-94e5-e48f-00000000045e] 15794 1726882648.75885: sending task result for task 0affe814-3a2d-94e5-e48f-00000000045e 15794 1726882648.83294: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000045e 15794 1726882648.83299: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
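[Editor's aside, not part of the trace: the task result above is censored by no_log, but the raw payload earlier in this trace shows the shape of the package_facts output — ansible_facts.packages maps each package name to a list of {name, version, release, epoch, arch, source} dicts (a list, since multiple versions or arches of one package can coexist). A minimal usage sketch; the 'git' key and its 2.46.0 version are taken from the fact dump visible above for this host:

```yaml
- name: Show installed git version (sketch; assumes package_facts already ran)
  ansible.builtin.debug:
    msg: "git {{ ansible_facts.packages['git'][0].version }}"
```
End of aside.]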
15794 1726882648.83466: no more pending results, returning what we have 15794 1726882648.83469: results queue empty 15794 1726882648.83471: checking for any_errors_fatal 15794 1726882648.83477: done checking for any_errors_fatal 15794 1726882648.83478: checking for max_fail_percentage 15794 1726882648.83479: done checking for max_fail_percentage 15794 1726882648.83480: checking to see if all hosts have failed and the running result is not ok 15794 1726882648.83481: done checking to see if all hosts have failed 15794 1726882648.83482: getting the remaining hosts for this loop 15794 1726882648.83484: done getting the remaining hosts for this loop 15794 1726882648.83488: getting the next task for host managed_node1 15794 1726882648.83495: done getting next task for host managed_node1 15794 1726882648.83500: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15794 1726882648.83502: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882648.83513: getting variables 15794 1726882648.83515: in VariableManager get_vars() 15794 1726882648.83755: Calling all_inventory to load vars for managed_node1 15794 1726882648.83758: Calling groups_inventory to load vars for managed_node1 15794 1726882648.83762: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882648.83772: Calling all_plugins_play to load vars for managed_node1 15794 1726882648.83776: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882648.83780: Calling groups_plugins_play to load vars for managed_node1 15794 1726882648.88847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882648.94082: done with get_vars() 15794 1726882648.94138: done getting variables 15794 1726882648.94218: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:28 -0400 (0:00:01.299) 0:00:46.500 ****** 15794 1726882648.94258: entering _queue_task() for managed_node1/debug 15794 1726882648.94862: worker is 1 (out of 1 available) 15794 1726882648.94872: exiting _queue_task() for managed_node1/debug 15794 1726882648.94886: done queuing things up, now waiting for results queue to drain 15794 1726882648.94889: waiting for pending results... 
15794 1726882648.95009: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15794 1726882648.95158: in run() - task 0affe814-3a2d-94e5-e48f-00000000005d 15794 1726882648.95185: variable 'ansible_search_path' from source: unknown 15794 1726882648.95196: variable 'ansible_search_path' from source: unknown 15794 1726882648.95253: calling self._execute() 15794 1726882648.95376: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882648.95395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882648.95412: variable 'omit' from source: magic vars 15794 1726882648.95996: variable 'ansible_distribution_major_version' from source: facts 15794 1726882648.96001: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882648.96005: variable 'omit' from source: magic vars 15794 1726882648.96008: variable 'omit' from source: magic vars 15794 1726882648.96138: variable 'network_provider' from source: set_fact 15794 1726882648.96215: variable 'omit' from source: magic vars 15794 1726882648.96225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882648.96278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882648.96311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882648.96349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882648.96368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882648.96412: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882648.96433: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 
1726882648.96454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882648.96585: Set connection var ansible_connection to ssh 15794 1726882648.96640: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882648.96643: Set connection var ansible_pipelining to False 15794 1726882648.96650: Set connection var ansible_shell_executable to /bin/sh 15794 1726882648.96653: Set connection var ansible_shell_type to sh 15794 1726882648.96655: Set connection var ansible_timeout to 10 15794 1726882648.96694: variable 'ansible_shell_executable' from source: unknown 15794 1726882648.96704: variable 'ansible_connection' from source: unknown 15794 1726882648.96713: variable 'ansible_module_compression' from source: unknown 15794 1726882648.96720: variable 'ansible_shell_type' from source: unknown 15794 1726882648.96740: variable 'ansible_shell_executable' from source: unknown 15794 1726882648.96743: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882648.96759: variable 'ansible_pipelining' from source: unknown 15794 1726882648.96761: variable 'ansible_timeout' from source: unknown 15794 1726882648.96769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882648.96980: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882648.96997: variable 'omit' from source: magic vars 15794 1726882648.97041: starting attempt loop 15794 1726882648.97044: running the handler 15794 1726882648.97083: handler run complete 15794 1726882648.97116: attempt loop complete, returning result 15794 1726882648.97126: _execute() done 15794 1726882648.97199: dumping result to json 15794 1726882648.97203: done dumping result, returning 
15794 1726882648.97205: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-94e5-e48f-00000000005d] 15794 1726882648.97210: sending task result for task 0affe814-3a2d-94e5-e48f-00000000005d 15794 1726882648.97290: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000005d ok: [managed_node1] => {} MSG: Using network provider: nm 15794 1726882648.97381: no more pending results, returning what we have 15794 1726882648.97386: results queue empty 15794 1726882648.97388: checking for any_errors_fatal 15794 1726882648.97403: done checking for any_errors_fatal 15794 1726882648.97404: checking for max_fail_percentage 15794 1726882648.97407: done checking for max_fail_percentage 15794 1726882648.97408: checking to see if all hosts have failed and the running result is not ok 15794 1726882648.97409: done checking to see if all hosts have failed 15794 1726882648.97410: getting the remaining hosts for this loop 15794 1726882648.97412: done getting the remaining hosts for this loop 15794 1726882648.97418: getting the next task for host managed_node1 15794 1726882648.97426: done getting next task for host managed_node1 15794 1726882648.97431: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15794 1726882648.97550: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882648.97564: getting variables 15794 1726882648.97566: in VariableManager get_vars() 15794 1726882648.97615: Calling all_inventory to load vars for managed_node1 15794 1726882648.97619: Calling groups_inventory to load vars for managed_node1 15794 1726882648.97622: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882648.97740: Calling all_plugins_play to load vars for managed_node1 15794 1726882648.97746: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882648.97751: Calling groups_plugins_play to load vars for managed_node1 15794 1726882648.98281: WORKER PROCESS EXITING 15794 1726882649.00187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882649.03246: done with get_vars() 15794 1726882649.03291: done getting variables 15794 1726882649.03373: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:29 -0400 (0:00:00.091) 0:00:46.592 ****** 15794 1726882649.03411: entering _queue_task() for managed_node1/fail 15794 1726882649.03789: worker is 1 (out of 1 available) 15794 1726882649.03802: exiting _queue_task() for managed_node1/fail 15794 1726882649.03816: done queuing things up, now waiting for results queue to drain 15794 1726882649.03817: waiting for pending results... 
15794 1726882649.04140: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15794 1726882649.04286: in run() - task 0affe814-3a2d-94e5-e48f-00000000005e 15794 1726882649.04308: variable 'ansible_search_path' from source: unknown 15794 1726882649.04316: variable 'ansible_search_path' from source: unknown 15794 1726882649.04367: calling self._execute() 15794 1726882649.04496: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882649.04510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882649.04529: variable 'omit' from source: magic vars 15794 1726882649.05444: variable 'ansible_distribution_major_version' from source: facts 15794 1726882649.05448: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882649.05755: variable 'network_state' from source: role '' defaults 15794 1726882649.05777: Evaluated conditional (network_state != {}): False 15794 1726882649.05790: when evaluation is False, skipping this task 15794 1726882649.05800: _execute() done 15794 1726882649.05881: dumping result to json 15794 1726882649.05885: done dumping result, returning 15794 1726882649.05888: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-94e5-e48f-00000000005e] 15794 1726882649.05891: sending task result for task 0affe814-3a2d-94e5-e48f-00000000005e 15794 1726882649.06314: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000005e 15794 1726882649.06317: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882649.06382: no more pending results, 
returning what we have 15794 1726882649.06388: results queue empty 15794 1726882649.06390: checking for any_errors_fatal 15794 1726882649.06398: done checking for any_errors_fatal 15794 1726882649.06399: checking for max_fail_percentage 15794 1726882649.06401: done checking for max_fail_percentage 15794 1726882649.06403: checking to see if all hosts have failed and the running result is not ok 15794 1726882649.06404: done checking to see if all hosts have failed 15794 1726882649.06405: getting the remaining hosts for this loop 15794 1726882649.06407: done getting the remaining hosts for this loop 15794 1726882649.06412: getting the next task for host managed_node1 15794 1726882649.06420: done getting next task for host managed_node1 15794 1726882649.06425: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15794 1726882649.06431: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882649.06450: getting variables 15794 1726882649.06452: in VariableManager get_vars() 15794 1726882649.06502: Calling all_inventory to load vars for managed_node1 15794 1726882649.06506: Calling groups_inventory to load vars for managed_node1 15794 1726882649.06510: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882649.06525: Calling all_plugins_play to load vars for managed_node1 15794 1726882649.06529: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882649.06936: Calling groups_plugins_play to load vars for managed_node1 15794 1726882649.11603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882649.18417: done with get_vars() 15794 1726882649.18469: done getting variables 15794 1726882649.18746: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:29 -0400 (0:00:00.153) 0:00:46.745 ****** 15794 1726882649.18787: entering _queue_task() for managed_node1/fail 15794 1726882649.19369: worker is 1 (out of 1 available) 15794 1726882649.19386: exiting _queue_task() for managed_node1/fail 15794 1726882649.19398: done queuing things up, now waiting for results queue to drain 15794 1726882649.19399: waiting for pending results... 
15794 1726882649.20185: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15794 1726882649.20302: in run() - task 0affe814-3a2d-94e5-e48f-00000000005f 15794 1726882649.20540: variable 'ansible_search_path' from source: unknown 15794 1726882649.20544: variable 'ansible_search_path' from source: unknown 15794 1726882649.20548: calling self._execute() 15794 1726882649.20666: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882649.20764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882649.20785: variable 'omit' from source: magic vars 15794 1726882649.21842: variable 'ansible_distribution_major_version' from source: facts 15794 1726882649.21846: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882649.22040: variable 'network_state' from source: role '' defaults 15794 1726882649.22154: Evaluated conditional (network_state != {}): False 15794 1726882649.22165: when evaluation is False, skipping this task 15794 1726882649.22239: _execute() done 15794 1726882649.22242: dumping result to json 15794 1726882649.22245: done dumping result, returning 15794 1726882649.22248: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-94e5-e48f-00000000005f] 15794 1726882649.22251: sending task result for task 0affe814-3a2d-94e5-e48f-00000000005f 15794 1726882649.22540: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000005f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882649.22603: no more pending results, returning what we have 15794 1726882649.22609: results queue empty 15794 
1726882649.22610: checking for any_errors_fatal 15794 1726882649.22620: done checking for any_errors_fatal 15794 1726882649.22621: checking for max_fail_percentage 15794 1726882649.22624: done checking for max_fail_percentage 15794 1726882649.22625: checking to see if all hosts have failed and the running result is not ok 15794 1726882649.22626: done checking to see if all hosts have failed 15794 1726882649.22627: getting the remaining hosts for this loop 15794 1726882649.22629: done getting the remaining hosts for this loop 15794 1726882649.22637: getting the next task for host managed_node1 15794 1726882649.22644: done getting next task for host managed_node1 15794 1726882649.22650: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15794 1726882649.22653: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882649.22671: getting variables 15794 1726882649.22673: in VariableManager get_vars() 15794 1726882649.22722: Calling all_inventory to load vars for managed_node1 15794 1726882649.22726: Calling groups_inventory to load vars for managed_node1 15794 1726882649.22730: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882649.22950: Calling all_plugins_play to load vars for managed_node1 15794 1726882649.22955: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882649.22962: WORKER PROCESS EXITING 15794 1726882649.22967: Calling groups_plugins_play to load vars for managed_node1 15794 1726882649.39981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882649.47406: done with get_vars() 15794 1726882649.47589: done getting variables 15794 1726882649.47776: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:29 -0400 (0:00:00.290) 0:00:47.036 ****** 15794 1726882649.47818: entering _queue_task() for managed_node1/fail 15794 1726882649.48696: worker is 1 (out of 1 available) 15794 1726882649.48708: exiting _queue_task() for managed_node1/fail 15794 1726882649.48719: done queuing things up, now waiting for results queue to drain 15794 1726882649.48721: waiting for pending results... 
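[annotation] The `skipping: [managed_node1]` result above comes from the task's `when` conditions: `ansible_distribution_major_version != '6'` passes, but `network_state != {}` is False because `network_state` is still the role default `{}`. As a rough illustration only (this is not Ansible's actual TaskExecutor code), the evaluation reduces to:

```python
# Hypothetical sketch of how chained `when` conditions produce the
# skip result seen in the log. NOT Ansible's real implementation;
# each lambda stands in for a rendered Jinja2 expression.

def evaluate_when(conditions, variables):
    """Return (passed, failing_condition): stop at the first False."""
    for expr, func in conditions:
        if not func(variables):
            return False, expr
    return True, None

facts = {
    "ansible_distribution_major_version": "40",  # assumed Fedora version
    "network_state": {},                         # role default per the log
}

conditions = [
    ("ansible_distribution_major_version != '6'",
     lambda v: v["ansible_distribution_major_version"] != "6"),
    ("network_state != {}",
     lambda v: v["network_state"] != {}),
]

passed, false_condition = evaluate_when(conditions, facts)
if not passed:
    # Shape mirrors the skip result in the log above.
    result = {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }
    print(result)
```

Note that only the first failing condition is reported as `false_condition`, which is why the log names `network_state != {}` and not the version check.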
15794 1726882649.49254: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15794 1726882649.49506: in run() - task 0affe814-3a2d-94e5-e48f-000000000060 15794 1726882649.49528: variable 'ansible_search_path' from source: unknown 15794 1726882649.49539: variable 'ansible_search_path' from source: unknown 15794 1726882649.49590: calling self._execute() 15794 1726882649.49953: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882649.50139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882649.50143: variable 'omit' from source: magic vars 15794 1726882649.50828: variable 'ansible_distribution_major_version' from source: facts 15794 1726882649.50849: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882649.51660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882649.57990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882649.58146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882649.58342: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882649.58437: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882649.58532: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882649.58703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882649.58801: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882649.58972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882649.59029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882649.59104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882649.59339: variable 'ansible_distribution_major_version' from source: facts 15794 1726882649.59460: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15794 1726882649.59809: variable 'ansible_distribution' from source: facts 15794 1726882649.59813: variable '__network_rh_distros' from source: role '' defaults 15794 1726882649.59826: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15794 1726882649.59858: when evaluation is False, skipping this task 15794 1726882649.59866: _execute() done 15794 1726882649.59873: dumping result to json 15794 1726882649.59898: done dumping result, returning 15794 1726882649.59928: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-94e5-e48f-000000000060] 15794 1726882649.60036: sending task result for task 0affe814-3a2d-94e5-e48f-000000000060 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": 
"Conditional result was False" } 15794 1726882649.60211: no more pending results, returning what we have 15794 1726882649.60215: results queue empty 15794 1726882649.60216: checking for any_errors_fatal 15794 1726882649.60224: done checking for any_errors_fatal 15794 1726882649.60225: checking for max_fail_percentage 15794 1726882649.60227: done checking for max_fail_percentage 15794 1726882649.60228: checking to see if all hosts have failed and the running result is not ok 15794 1726882649.60229: done checking to see if all hosts have failed 15794 1726882649.60230: getting the remaining hosts for this loop 15794 1726882649.60232: done getting the remaining hosts for this loop 15794 1726882649.60242: getting the next task for host managed_node1 15794 1726882649.60250: done getting next task for host managed_node1 15794 1726882649.60256: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15794 1726882649.60258: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882649.60282: getting variables 15794 1726882649.60285: in VariableManager get_vars() 15794 1726882649.60330: Calling all_inventory to load vars for managed_node1 15794 1726882649.60333: Calling groups_inventory to load vars for managed_node1 15794 1726882649.60571: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882649.60586: Calling all_plugins_play to load vars for managed_node1 15794 1726882649.60590: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882649.60594: Calling groups_plugins_play to load vars for managed_node1 15794 1726882649.61185: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000060 15794 1726882649.61189: WORKER PROCESS EXITING 15794 1726882649.63914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882649.67020: done with get_vars() 15794 1726882649.67063: done getting variables 15794 1726882649.67142: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:29 -0400 (0:00:00.193) 0:00:47.229 ****** 15794 1726882649.67177: entering _queue_task() for managed_node1/dnf 15794 1726882649.67658: worker is 1 (out of 1 available) 15794 1726882649.67672: exiting _queue_task() for managed_node1/dnf 15794 1726882649.67684: done queuing things up, now waiting for results queue to drain 15794 1726882649.67685: waiting for pending results... 
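[annotation] The EL10 teaming guard above chains `ansible_distribution_major_version | int > 9` (True here) with `ansible_distribution in __network_rh_distros` (False here), so the abort is skipped. A minimal sketch of the `int`-filter comparison; the facts and the distro list contents are assumptions for illustration, not values taken from the role:

```python
# Hypothetical stand-in for Jinja2's `int` filter and the two guards
# from the EL10 teaming check in the log.

def jinja_int(value, default=0):
    """Rough model of Jinja2's `int` filter: coerce, else fall back."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return default

facts = {
    "ansible_distribution": "Fedora",           # assumption
    "ansible_distribution_major_version": "40", # assumption
}
network_rh_distros = ["RedHat", "CentOS", "OracleLinux"]  # assumed contents

cond1 = jinja_int(facts["ansible_distribution_major_version"]) > 9
cond2 = facts["ansible_distribution"] in network_rh_distros
print(cond1, cond2)  # True False -> task skipped on the second condition
```

The `| int` filter matters because `ansible_distribution_major_version` is a string fact; a plain `> 9` comparison on `"40"` would not behave numerically.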
15794 1726882649.67938: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15794 1726882649.68081: in run() - task 0affe814-3a2d-94e5-e48f-000000000061 15794 1726882649.68104: variable 'ansible_search_path' from source: unknown 15794 1726882649.68113: variable 'ansible_search_path' from source: unknown 15794 1726882649.68162: calling self._execute() 15794 1726882649.68289: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882649.68304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882649.68321: variable 'omit' from source: magic vars 15794 1726882649.68795: variable 'ansible_distribution_major_version' from source: facts 15794 1726882649.68814: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882649.69106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882649.72376: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882649.72471: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882649.72541: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882649.72640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882649.72645: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882649.72725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882649.72871: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882649.72875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882649.72878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882649.72901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882649.73051: variable 'ansible_distribution' from source: facts 15794 1726882649.73061: variable 'ansible_distribution_major_version' from source: facts 15794 1726882649.73074: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15794 1726882649.73243: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882649.73449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882649.73484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882649.73523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882649.73631: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882649.73641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882649.73674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882649.73711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882649.73760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882649.73816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882649.73841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882649.73973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882649.73977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 
1726882649.73980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882649.74083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882649.74542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882649.74546: variable 'network_connections' from source: play vars 15794 1726882649.74548: variable 'profile' from source: play vars 15794 1726882649.74729: variable 'profile' from source: play vars 15794 1726882649.74840: variable 'interface' from source: set_fact 15794 1726882649.74930: variable 'interface' from source: set_fact 15794 1726882649.75071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882649.75511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882649.75685: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882649.75788: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882649.75907: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882649.76019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882649.76181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882649.76247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882649.76274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882649.76460: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882649.77248: variable 'network_connections' from source: play vars 15794 1726882649.77261: variable 'profile' from source: play vars 15794 1726882649.77353: variable 'profile' from source: play vars 15794 1726882649.77539: variable 'interface' from source: set_fact 15794 1726882649.77591: variable 'interface' from source: set_fact 15794 1726882649.77652: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15794 1726882649.77721: when evaluation is False, skipping this task 15794 1726882649.78039: _execute() done 15794 1726882649.78045: dumping result to json 15794 1726882649.78048: done dumping result, returning 15794 1726882649.78051: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000061] 15794 1726882649.78053: sending task result for task 0affe814-3a2d-94e5-e48f-000000000061 15794 1726882649.78133: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000061 15794 1726882649.78139: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15794 1726882649.78203: no more pending results, returning what we have 15794 1726882649.78207: results queue empty 15794 1726882649.78209: checking for any_errors_fatal 15794 1726882649.78218: done checking for any_errors_fatal 15794 1726882649.78219: checking for max_fail_percentage 15794 1726882649.78221: done checking for max_fail_percentage 15794 1726882649.78222: checking to see if all hosts have failed and the running result is not ok 15794 1726882649.78223: done checking to see if all hosts have failed 15794 1726882649.78224: getting the remaining hosts for this loop 15794 1726882649.78227: done getting the remaining hosts for this loop 15794 1726882649.78232: getting the next task for host managed_node1 15794 1726882649.78242: done getting next task for host managed_node1 15794 1726882649.78248: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15794 1726882649.78250: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882649.78269: getting variables 15794 1726882649.78271: in VariableManager get_vars() 15794 1726882649.78316: Calling all_inventory to load vars for managed_node1 15794 1726882649.78319: Calling groups_inventory to load vars for managed_node1 15794 1726882649.78322: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882649.78570: Calling all_plugins_play to load vars for managed_node1 15794 1726882649.78576: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882649.78581: Calling groups_plugins_play to load vars for managed_node1 15794 1726882649.84503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882649.93130: done with get_vars() 15794 1726882649.93304: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15794 1726882649.93731: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:29 -0400 (0:00:00.265) 0:00:47.495 ****** 15794 1726882649.93771: entering _queue_task() for managed_node1/yum 15794 1726882649.94968: worker is 1 (out of 1 available) 15794 1726882649.94987: exiting _queue_task() for managed_node1/yum 15794 1726882649.95001: done queuing things up, now waiting for results queue to drain 15794 1726882649.95002: waiting for pending results... 
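[annotation] The DNF check above was skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` was True: no profile in `network_connections` uses a wireless or team type. A rough model of that scan (the real role expresses it as Jinja2 role defaults, and the profile shown is invented for illustration):

```python
# Hypothetical sketch of the wireless/team detection behind
# `__network_wireless_connections_defined or
#  __network_team_connections_defined`.

def any_connection_of_type(network_connections, conn_type):
    """True if any profile declares the given connection type."""
    return any(c.get("type") == conn_type for c in network_connections)

# An ethernet-only profile (assumed), as in this test run,
# triggers neither guard.
network_connections = [
    {"name": "testprofile", "type": "ethernet"},
]

wireless_defined = any_connection_of_type(network_connections, "wireless")
team_defined = any_connection_of_type(network_connections, "team")
print(wireless_defined or team_defined)  # False -> task is skipped
```

The same compound condition gates the YUM variant that follows and the "Ask user's consent to restart NetworkManager" task later in the log, which is why all three are skipped together on ethernet-only runs.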
15794 1726882649.95748: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15794 1726882649.96745: in run() - task 0affe814-3a2d-94e5-e48f-000000000062 15794 1726882649.96750: variable 'ansible_search_path' from source: unknown 15794 1726882649.96754: variable 'ansible_search_path' from source: unknown 15794 1726882649.96757: calling self._execute() 15794 1726882649.96893: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882649.96910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882649.97055: variable 'omit' from source: magic vars 15794 1726882649.98421: variable 'ansible_distribution_major_version' from source: facts 15794 1726882649.98819: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882649.99201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882650.04185: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882650.04295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882650.04353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882650.04407: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882650.04448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882650.04557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.04609: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.04647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.04715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.04787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.04872: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.04902: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15794 1726882650.04911: when evaluation is False, skipping this task 15794 1726882650.04926: _execute() done 15794 1726882650.04935: dumping result to json 15794 1726882650.04945: done dumping result, returning 15794 1726882650.04958: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000062] 15794 1726882650.05004: sending task result for task 0affe814-3a2d-94e5-e48f-000000000062 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15794 1726882650.05284: no more pending results, returning what we have 15794 1726882650.05290: results queue empty 15794 1726882650.05292: checking for any_errors_fatal 15794 1726882650.05305: done 
checking for any_errors_fatal 15794 1726882650.05306: checking for max_fail_percentage 15794 1726882650.05308: done checking for max_fail_percentage 15794 1726882650.05309: checking to see if all hosts have failed and the running result is not ok 15794 1726882650.05311: done checking to see if all hosts have failed 15794 1726882650.05312: getting the remaining hosts for this loop 15794 1726882650.05314: done getting the remaining hosts for this loop 15794 1726882650.05319: getting the next task for host managed_node1 15794 1726882650.05555: done getting next task for host managed_node1 15794 1726882650.05561: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15794 1726882650.05564: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882650.05667: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000062 15794 1726882650.05670: WORKER PROCESS EXITING 15794 1726882650.05684: getting variables 15794 1726882650.05686: in VariableManager get_vars() 15794 1726882650.05733: Calling all_inventory to load vars for managed_node1 15794 1726882650.06041: Calling groups_inventory to load vars for managed_node1 15794 1726882650.06045: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882650.06056: Calling all_plugins_play to load vars for managed_node1 15794 1726882650.06060: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882650.06065: Calling groups_plugins_play to load vars for managed_node1 15794 1726882650.09212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882650.16647: done with get_vars() 15794 1726882650.16918: done getting variables 15794 1726882650.16993: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:30 -0400 (0:00:00.232) 0:00:47.730 ****** 15794 1726882650.17248: entering _queue_task() for managed_node1/fail 15794 1726882650.18418: worker is 1 (out of 1 available) 15794 1726882650.18431: exiting _queue_task() for managed_node1/fail 15794 1726882650.18445: done queuing things up, now waiting for results queue to drain 15794 1726882650.18447: waiting for pending results... 
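[annotation] Note the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line above: the DNF and YUM update-check tasks are mutually exclusive on the major version, and the YUM variant only runs when `ansible_distribution_major_version | int < 8`, which is False on this host. A toy dispatcher under that assumption:

```python
# Hypothetical sketch of the version gate between the paired
# DNF/YUM tasks; not code from the role itself.

def pick_package_manager(major_version):
    """YUM path only below EL8, DNF otherwise, per the log's conditions."""
    return "yum" if int(major_version) < 8 else "dnf"

print(pick_package_manager("7"))   # yum
print(pick_package_manager("40"))  # dnf
```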
15794 1726882650.19367: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15794 1726882650.19743: in run() - task 0affe814-3a2d-94e5-e48f-000000000063 15794 1726882650.19941: variable 'ansible_search_path' from source: unknown 15794 1726882650.19945: variable 'ansible_search_path' from source: unknown 15794 1726882650.19949: calling self._execute() 15794 1726882650.20208: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.20642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.20646: variable 'omit' from source: magic vars 15794 1726882650.21694: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.22056: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882650.22216: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882650.23443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882650.30447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882650.30662: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882650.30715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882650.30895: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882650.30933: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882650.31126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15794 1726882650.31219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.31277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.31442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.31467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.31654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.31693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.31761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.31893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.31962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.32021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.32195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.32233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.32397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.32443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.32914: variable 'network_connections' from source: play vars 15794 1726882650.33142: variable 'profile' from source: play vars 15794 1726882650.33341: variable 'profile' from source: play vars 15794 1726882650.33344: variable 'interface' from source: set_fact 15794 1726882650.33440: variable 'interface' from source: set_fact 15794 1726882650.33586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882650.34049: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882650.34173: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882650.34219: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882650.34380: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882650.34472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882650.34572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882650.34612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.34795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882650.34856: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882650.35542: variable 'network_connections' from source: play vars 15794 1726882650.35762: variable 'profile' from source: play vars 15794 1726882650.35766: variable 'profile' from source: play vars 15794 1726882650.35769: variable 'interface' from source: set_fact 15794 1726882650.36013: variable 'interface' from source: set_fact 15794 1726882650.36051: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15794 1726882650.36060: when evaluation is False, skipping this task 15794 1726882650.36068: _execute() done 15794 1726882650.36075: dumping result to json 15794 1726882650.36102: done dumping result, returning 15794 1726882650.36257: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000063] 15794 1726882650.36268: sending task result for task 0affe814-3a2d-94e5-e48f-000000000063 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15794 1726882650.36428: no more pending results, returning what we have 15794 1726882650.36433: results queue empty 15794 1726882650.36434: checking for any_errors_fatal 15794 1726882650.36442: done checking for any_errors_fatal 15794 1726882650.36443: checking for max_fail_percentage 15794 1726882650.36445: done checking for max_fail_percentage 15794 1726882650.36446: checking to see if all hosts have failed and the running result is not ok 15794 1726882650.36447: done checking to see if all hosts have failed 15794 1726882650.36448: getting the remaining hosts for this loop 15794 1726882650.36450: done getting the remaining hosts for this loop 15794 1726882650.36455: getting the next task for host managed_node1 15794 1726882650.36462: done getting next task for host managed_node1 15794 1726882650.36467: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15794 1726882650.36469: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882650.36486: getting variables 15794 1726882650.36490: in VariableManager get_vars() 15794 1726882650.36560: Calling all_inventory to load vars for managed_node1 15794 1726882650.36564: Calling groups_inventory to load vars for managed_node1 15794 1726882650.36567: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882650.36578: Calling all_plugins_play to load vars for managed_node1 15794 1726882650.36582: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882650.36586: Calling groups_plugins_play to load vars for managed_node1 15794 1726882650.37708: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000063 15794 1726882650.37712: WORKER PROCESS EXITING 15794 1726882650.41310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882650.44515: done with get_vars() 15794 1726882650.44559: done getting variables 15794 1726882650.44631: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:30 -0400 (0:00:00.274) 0:00:48.004 ****** 15794 1726882650.44676: entering _queue_task() for managed_node1/package 15794 1726882650.45293: worker is 1 (out of 1 available) 15794 1726882650.45310: exiting _queue_task() for managed_node1/package 15794 1726882650.45323: done queuing things up, now waiting for results queue to drain 15794 1726882650.45325: waiting for pending results... 
15794 1726882650.45868: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15794 1726882650.46245: in run() - task 0affe814-3a2d-94e5-e48f-000000000064 15794 1726882650.46249: variable 'ansible_search_path' from source: unknown 15794 1726882650.46252: variable 'ansible_search_path' from source: unknown 15794 1726882650.46297: calling self._execute() 15794 1726882650.46446: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.46453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.46465: variable 'omit' from source: magic vars 15794 1726882650.46873: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.46886: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882650.47062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882650.47305: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882650.47347: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882650.47377: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882650.47439: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882650.47538: variable 'network_packages' from source: role '' defaults 15794 1726882650.47633: variable '__network_provider_setup' from source: role '' defaults 15794 1726882650.47645: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882650.47710: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882650.47719: variable '__network_packages_default_nm' from source: role '' defaults 15794 1726882650.47774: variable 
'__network_packages_default_nm' from source: role '' defaults 15794 1726882650.47937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882650.49976: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882650.50031: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882650.50073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882650.50116: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882650.50148: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882650.50240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.50281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.50309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.50392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.50404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 
1726882650.50435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.50465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.50501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.50550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.50579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.50847: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15794 1726882650.50957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.50977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.51000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.51030: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.51048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.51123: variable 'ansible_python' from source: facts 15794 1726882650.51147: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15794 1726882650.51216: variable '__network_wpa_supplicant_required' from source: role '' defaults 15794 1726882650.51313: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15794 1726882650.51425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.51447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.51468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.51507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.51517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.51558: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.51581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.51603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.51641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.51654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.51924: variable 'network_connections' from source: play vars 15794 1726882650.51928: variable 'profile' from source: play vars 15794 1726882650.51952: variable 'profile' from source: play vars 15794 1726882650.51959: variable 'interface' from source: set_fact 15794 1726882650.52042: variable 'interface' from source: set_fact 15794 1726882650.52126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882650.52167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882650.52197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.52232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882650.52287: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882650.52663: variable 'network_connections' from source: play vars 15794 1726882650.52680: variable 'profile' from source: play vars 15794 1726882650.52845: variable 'profile' from source: play vars 15794 1726882650.52857: variable 'interface' from source: set_fact 15794 1726882650.53139: variable 'interface' from source: set_fact 15794 1726882650.53143: variable '__network_packages_default_wireless' from source: role '' defaults 15794 1726882650.53145: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882650.53526: variable 'network_connections' from source: play vars 15794 1726882650.53530: variable 'profile' from source: play vars 15794 1726882650.53586: variable 'profile' from source: play vars 15794 1726882650.53597: variable 'interface' from source: set_fact 15794 1726882650.53677: variable 'interface' from source: set_fact 15794 1726882650.53704: variable '__network_packages_default_team' from source: role '' defaults 15794 1726882650.53769: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882650.54037: variable 'network_connections' from source: play vars 15794 1726882650.54040: variable 'profile' from source: play vars 15794 1726882650.54098: variable 'profile' from source: play vars 15794 1726882650.54102: variable 'interface' from source: set_fact 15794 1726882650.54188: variable 'interface' from source: set_fact 15794 1726882650.54232: variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 1726882650.54290: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 1726882650.54296: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882650.54347: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882650.54538: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15794 1726882650.54942: variable 'network_connections' from source: play vars 15794 1726882650.54947: variable 'profile' from source: play vars 15794 1726882650.54999: variable 'profile' from source: play vars 15794 1726882650.55002: variable 'interface' from source: set_fact 15794 1726882650.55061: variable 'interface' from source: set_fact 15794 1726882650.55070: variable 'ansible_distribution' from source: facts 15794 1726882650.55074: variable '__network_rh_distros' from source: role '' defaults 15794 1726882650.55083: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.55096: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15794 1726882650.55238: variable 'ansible_distribution' from source: facts 15794 1726882650.55242: variable '__network_rh_distros' from source: role '' defaults 15794 1726882650.55248: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.55257: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15794 1726882650.55399: variable 'ansible_distribution' from source: facts 15794 1726882650.55403: variable '__network_rh_distros' from source: role '' defaults 15794 1726882650.55409: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.55440: variable 'network_provider' from source: set_fact 15794 1726882650.55456: variable 'ansible_facts' from source: unknown 15794 1726882650.56160: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15794 
1726882650.56163: when evaluation is False, skipping this task 15794 1726882650.56166: _execute() done 15794 1726882650.56168: dumping result to json 15794 1726882650.56170: done dumping result, returning 15794 1726882650.56172: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-94e5-e48f-000000000064] 15794 1726882650.56174: sending task result for task 0affe814-3a2d-94e5-e48f-000000000064 15794 1726882650.56343: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000064 15794 1726882650.56346: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15794 1726882650.56428: no more pending results, returning what we have 15794 1726882650.56432: results queue empty 15794 1726882650.56433: checking for any_errors_fatal 15794 1726882650.56442: done checking for any_errors_fatal 15794 1726882650.56443: checking for max_fail_percentage 15794 1726882650.56446: done checking for max_fail_percentage 15794 1726882650.56446: checking to see if all hosts have failed and the running result is not ok 15794 1726882650.56447: done checking to see if all hosts have failed 15794 1726882650.56448: getting the remaining hosts for this loop 15794 1726882650.56450: done getting the remaining hosts for this loop 15794 1726882650.56453: getting the next task for host managed_node1 15794 1726882650.56461: done getting next task for host managed_node1 15794 1726882650.56469: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15794 1726882650.56471: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15794 1726882650.56486: getting variables 15794 1726882650.56489: in VariableManager get_vars() 15794 1726882650.56526: Calling all_inventory to load vars for managed_node1 15794 1726882650.56529: Calling groups_inventory to load vars for managed_node1 15794 1726882650.56532: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882650.56610: Calling all_plugins_play to load vars for managed_node1 15794 1726882650.56614: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882650.56619: Calling groups_plugins_play to load vars for managed_node1 15794 1726882650.59515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882650.61983: done with get_vars() 15794 1726882650.62021: done getting variables 15794 1726882650.62091: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:30 -0400 (0:00:00.174) 0:00:48.179 ****** 15794 1726882650.62126: entering _queue_task() for managed_node1/package 15794 1726882650.62435: worker is 1 (out of 1 available) 15794 1726882650.62450: exiting _queue_task() for managed_node1/package 15794 1726882650.62464: done queuing things up, now waiting for results queue to drain 15794 1726882650.62465: waiting for pending results... 
15794 1726882650.62675: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15794 1726882650.62855: in run() - task 0affe814-3a2d-94e5-e48f-000000000065 15794 1726882650.62861: variable 'ansible_search_path' from source: unknown 15794 1726882650.62866: variable 'ansible_search_path' from source: unknown 15794 1726882650.62940: calling self._execute() 15794 1726882650.63158: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.63162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.63164: variable 'omit' from source: magic vars 15794 1726882650.63582: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.63620: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882650.63792: variable 'network_state' from source: role '' defaults 15794 1726882650.63824: Evaluated conditional (network_state != {}): False 15794 1726882650.63836: when evaluation is False, skipping this task 15794 1726882650.63844: _execute() done 15794 1726882650.63853: dumping result to json 15794 1726882650.63861: done dumping result, returning 15794 1726882650.63872: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-94e5-e48f-000000000065] 15794 1726882650.63885: sending task result for task 0affe814-3a2d-94e5-e48f-000000000065 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882650.64083: no more pending results, returning what we have 15794 1726882650.64089: results queue empty 15794 1726882650.64091: checking for any_errors_fatal 15794 1726882650.64099: done checking for any_errors_fatal 15794 1726882650.64100: checking for max_fail_percentage 15794 
1726882650.64102: done checking for max_fail_percentage 15794 1726882650.64103: checking to see if all hosts have failed and the running result is not ok 15794 1726882650.64104: done checking to see if all hosts have failed 15794 1726882650.64105: getting the remaining hosts for this loop 15794 1726882650.64108: done getting the remaining hosts for this loop 15794 1726882650.64113: getting the next task for host managed_node1 15794 1726882650.64120: done getting next task for host managed_node1 15794 1726882650.64126: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15794 1726882650.64129: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882650.64149: getting variables 15794 1726882650.64151: in VariableManager get_vars() 15794 1726882650.64195: Calling all_inventory to load vars for managed_node1 15794 1726882650.64199: Calling groups_inventory to load vars for managed_node1 15794 1726882650.64202: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882650.64217: Calling all_plugins_play to load vars for managed_node1 15794 1726882650.64221: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882650.64226: Calling groups_plugins_play to load vars for managed_node1 15794 1726882650.64991: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000065 15794 1726882650.64995: WORKER PROCESS EXITING 15794 1726882650.66030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882650.68094: done with get_vars() 15794 1726882650.68136: done getting variables 15794 1726882650.68233: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:30 -0400 (0:00:00.061) 0:00:48.240 ****** 15794 1726882650.68279: entering _queue_task() for managed_node1/package 15794 1726882650.68620: worker is 1 (out of 1 available) 15794 1726882650.68637: exiting _queue_task() for managed_node1/package 15794 1726882650.68650: done queuing things up, now waiting for results queue to drain 15794 1726882650.68652: waiting for pending results... 15794 1726882650.68873: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15794 1726882650.68974: in run() - task 0affe814-3a2d-94e5-e48f-000000000066 15794 1726882650.68989: variable 'ansible_search_path' from source: unknown 15794 1726882650.68993: variable 'ansible_search_path' from source: unknown 15794 1726882650.69031: calling self._execute() 15794 1726882650.69113: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.69117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.69125: variable 'omit' from source: magic vars 15794 1726882650.69450: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.69473: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882650.69646: variable 'network_state' from source: role '' defaults 15794 1726882650.69652: Evaluated conditional (network_state != {}): False 15794 1726882650.69656: when evaluation is False, 
skipping this task 15794 1726882650.69659: _execute() done 15794 1726882650.69664: dumping result to json 15794 1726882650.69666: done dumping result, returning 15794 1726882650.69738: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-94e5-e48f-000000000066] 15794 1726882650.69742: sending task result for task 0affe814-3a2d-94e5-e48f-000000000066 15794 1726882650.69823: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000066 15794 1726882650.69826: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882650.69902: no more pending results, returning what we have 15794 1726882650.69905: results queue empty 15794 1726882650.69907: checking for any_errors_fatal 15794 1726882650.69912: done checking for any_errors_fatal 15794 1726882650.69913: checking for max_fail_percentage 15794 1726882650.69915: done checking for max_fail_percentage 15794 1726882650.69916: checking to see if all hosts have failed and the running result is not ok 15794 1726882650.69917: done checking to see if all hosts have failed 15794 1726882650.69918: getting the remaining hosts for this loop 15794 1726882650.69920: done getting the remaining hosts for this loop 15794 1726882650.69923: getting the next task for host managed_node1 15794 1726882650.69932: done getting next task for host managed_node1 15794 1726882650.69937: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15794 1726882650.69939: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882650.69957: getting variables 15794 1726882650.69959: in VariableManager get_vars() 15794 1726882650.69998: Calling all_inventory to load vars for managed_node1 15794 1726882650.70001: Calling groups_inventory to load vars for managed_node1 15794 1726882650.70004: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882650.70027: Calling all_plugins_play to load vars for managed_node1 15794 1726882650.70031: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882650.70039: Calling groups_plugins_play to load vars for managed_node1 15794 1726882650.71783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882650.74093: done with get_vars() 15794 1726882650.74115: done getting variables 15794 1726882650.74177: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:30 -0400 (0:00:00.059) 0:00:48.300 ****** 15794 1726882650.74215: entering _queue_task() for managed_node1/service 15794 1726882650.74466: worker is 1 (out of 1 available) 15794 1726882650.74484: exiting _queue_task() for managed_node1/service 15794 1726882650.74501: done queuing things up, now waiting for results queue to drain 15794 1726882650.74503: waiting for pending results... 
15794 1726882650.74705: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15794 1726882650.74794: in run() - task 0affe814-3a2d-94e5-e48f-000000000067 15794 1726882650.74806: variable 'ansible_search_path' from source: unknown 15794 1726882650.74810: variable 'ansible_search_path' from source: unknown 15794 1726882650.74893: calling self._execute() 15794 1726882650.74963: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.74970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.74982: variable 'omit' from source: magic vars 15794 1726882650.75322: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.75333: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882650.75446: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882650.75693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882650.77904: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882650.77960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882650.78002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882650.78034: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882650.78060: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882650.78128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15794 1726882650.78157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.78179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.78216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.78228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.78274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.78296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.78319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.78353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.78368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.78407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.78428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.78451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.78489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.78502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.78646: variable 'network_connections' from source: play vars 15794 1726882650.78655: variable 'profile' from source: play vars 15794 1726882650.78720: variable 'profile' from source: play vars 15794 1726882650.78724: variable 'interface' from source: set_fact 15794 1726882650.78778: variable 'interface' from source: set_fact 15794 1726882650.78842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882650.78976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882650.79012: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882650.79042: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882650.79078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882650.79114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882650.79148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882650.79171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.79197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882650.79254: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882650.79455: variable 'network_connections' from source: play vars 15794 1726882650.79462: variable 'profile' from source: play vars 15794 1726882650.79517: variable 'profile' from source: play vars 15794 1726882650.79521: variable 'interface' from source: set_fact 15794 1726882650.79574: variable 'interface' from source: set_fact 15794 1726882650.79599: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15794 1726882650.79603: when evaluation is False, skipping this task 15794 1726882650.79605: _execute() done 15794 1726882650.79613: dumping result to json 15794 1726882650.79624: done dumping result, returning 15794 1726882650.79631: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affe814-3a2d-94e5-e48f-000000000067] 15794 1726882650.79642: sending task result for task 0affe814-3a2d-94e5-e48f-000000000067 15794 1726882650.79737: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000067 15794 1726882650.79740: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15794 1726882650.79794: no more pending results, returning what we have 15794 1726882650.79798: results queue empty 15794 1726882650.79799: checking for any_errors_fatal 15794 1726882650.79805: done checking for any_errors_fatal 15794 1726882650.79806: checking for max_fail_percentage 15794 1726882650.79808: done checking for max_fail_percentage 15794 1726882650.79809: checking to see if all hosts have failed and the running result is not ok 15794 1726882650.79810: done checking to see if all hosts have failed 15794 1726882650.79811: getting the remaining hosts for this loop 15794 1726882650.79813: done getting the remaining hosts for this loop 15794 1726882650.79817: getting the next task for host managed_node1 15794 1726882650.79823: done getting next task for host managed_node1 15794 1726882650.79828: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15794 1726882650.79830: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882650.79847: getting variables 15794 1726882650.79850: in VariableManager get_vars() 15794 1726882650.79887: Calling all_inventory to load vars for managed_node1 15794 1726882650.79890: Calling groups_inventory to load vars for managed_node1 15794 1726882650.79893: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882650.79902: Calling all_plugins_play to load vars for managed_node1 15794 1726882650.79906: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882650.79909: Calling groups_plugins_play to load vars for managed_node1 15794 1726882650.81274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882650.83738: done with get_vars() 15794 1726882650.83759: done getting variables 15794 1726882650.83809: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:30 -0400 (0:00:00.096) 0:00:48.396 ****** 15794 1726882650.83831: entering _queue_task() for managed_node1/service 15794 1726882650.84093: worker is 1 (out of 1 available) 15794 1726882650.84106: exiting _queue_task() for managed_node1/service 15794 1726882650.84122: done queuing things up, now waiting for results queue to drain 15794 1726882650.84125: waiting for pending results... 
15794 1726882650.84421: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15794 1726882650.84545: in run() - task 0affe814-3a2d-94e5-e48f-000000000068 15794 1726882650.84552: variable 'ansible_search_path' from source: unknown 15794 1726882650.84555: variable 'ansible_search_path' from source: unknown 15794 1726882650.84593: calling self._execute() 15794 1726882650.84671: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.84677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.84691: variable 'omit' from source: magic vars 15794 1726882650.85113: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.85123: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882650.85284: variable 'network_provider' from source: set_fact 15794 1726882650.85293: variable 'network_state' from source: role '' defaults 15794 1726882650.85303: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15794 1726882650.85309: variable 'omit' from source: magic vars 15794 1726882650.85342: variable 'omit' from source: magic vars 15794 1726882650.85368: variable 'network_service_name' from source: role '' defaults 15794 1726882650.85438: variable 'network_service_name' from source: role '' defaults 15794 1726882650.85571: variable '__network_provider_setup' from source: role '' defaults 15794 1726882650.85579: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882650.85676: variable '__network_service_name_default_nm' from source: role '' defaults 15794 1726882650.85679: variable '__network_packages_default_nm' from source: role '' defaults 15794 1726882650.85721: variable '__network_packages_default_nm' from source: role '' defaults 15794 1726882650.86054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15794 1726882650.88039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882650.88102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882650.88136: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882650.88167: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882650.88192: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882650.88260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.88288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.88309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.88347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.88359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.88402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15794 1726882650.88422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.88452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.88483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.88495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.88690: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15794 1726882650.88790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.88811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.88833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.88867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.88884: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.88956: variable 'ansible_python' from source: facts 15794 1726882650.88976: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15794 1726882650.89060: variable '__network_wpa_supplicant_required' from source: role '' defaults 15794 1726882650.89127: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15794 1726882650.89237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.89258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.89285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.89318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.89332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.89375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882650.89400: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882650.89423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.89457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882650.89469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882650.89585: variable 'network_connections' from source: play vars 15794 1726882650.89592: variable 'profile' from source: play vars 15794 1726882650.89657: variable 'profile' from source: play vars 15794 1726882650.89661: variable 'interface' from source: set_fact 15794 1726882650.89715: variable 'interface' from source: set_fact 15794 1726882650.89801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882650.89956: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882650.89997: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882650.90035: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882650.90073: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882650.90125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882650.90151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882650.90187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882650.90238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882650.90341: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882650.90631: variable 'network_connections' from source: play vars 15794 1726882650.90639: variable 'profile' from source: play vars 15794 1726882650.90738: variable 'profile' from source: play vars 15794 1726882650.90741: variable 'interface' from source: set_fact 15794 1726882650.90793: variable 'interface' from source: set_fact 15794 1726882650.90824: variable '__network_packages_default_wireless' from source: role '' defaults 15794 1726882650.90895: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882650.91155: variable 'network_connections' from source: play vars 15794 1726882650.91160: variable 'profile' from source: play vars 15794 1726882650.91239: variable 'profile' from source: play vars 15794 1726882650.91250: variable 'interface' from source: set_fact 15794 1726882650.91314: variable 'interface' from source: set_fact 15794 1726882650.91338: variable '__network_packages_default_team' from source: role '' defaults 15794 1726882650.91409: variable '__network_team_connections_defined' from source: role '' defaults 15794 1726882650.91725: variable 
'network_connections' from source: play vars 15794 1726882650.91731: variable 'profile' from source: play vars 15794 1726882650.91792: variable 'profile' from source: play vars 15794 1726882650.91795: variable 'interface' from source: set_fact 15794 1726882650.91861: variable 'interface' from source: set_fact 15794 1726882650.91908: variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 1726882650.91964: variable '__network_service_name_default_initscripts' from source: role '' defaults 15794 1726882650.91971: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882650.92028: variable '__network_packages_default_initscripts' from source: role '' defaults 15794 1726882650.92214: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15794 1726882650.92788: variable 'network_connections' from source: play vars 15794 1726882650.92792: variable 'profile' from source: play vars 15794 1726882650.92849: variable 'profile' from source: play vars 15794 1726882650.92852: variable 'interface' from source: set_fact 15794 1726882650.92914: variable 'interface' from source: set_fact 15794 1726882650.92921: variable 'ansible_distribution' from source: facts 15794 1726882650.92927: variable '__network_rh_distros' from source: role '' defaults 15794 1726882650.92937: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.92948: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15794 1726882650.93097: variable 'ansible_distribution' from source: facts 15794 1726882650.93100: variable '__network_rh_distros' from source: role '' defaults 15794 1726882650.93106: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.93113: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15794 1726882650.93263: variable 'ansible_distribution' from source: 
facts 15794 1726882650.93267: variable '__network_rh_distros' from source: role '' defaults 15794 1726882650.93273: variable 'ansible_distribution_major_version' from source: facts 15794 1726882650.93304: variable 'network_provider' from source: set_fact 15794 1726882650.93323: variable 'omit' from source: magic vars 15794 1726882650.93349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882650.93375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882650.93393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882650.93409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882650.93418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882650.93446: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882650.93449: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.93455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.93544: Set connection var ansible_connection to ssh 15794 1726882650.93554: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882650.93560: Set connection var ansible_pipelining to False 15794 1726882650.93569: Set connection var ansible_shell_executable to /bin/sh 15794 1726882650.93571: Set connection var ansible_shell_type to sh 15794 1726882650.93589: Set connection var ansible_timeout to 10 15794 1726882650.93610: variable 'ansible_shell_executable' from source: unknown 15794 1726882650.93613: variable 'ansible_connection' from source: unknown 15794 1726882650.93616: variable 'ansible_module_compression' from source: unknown 15794 1726882650.93620: 
variable 'ansible_shell_type' from source: unknown 15794 1726882650.93623: variable 'ansible_shell_executable' from source: unknown 15794 1726882650.93628: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882650.93635: variable 'ansible_pipelining' from source: unknown 15794 1726882650.93638: variable 'ansible_timeout' from source: unknown 15794 1726882650.93643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882650.93756: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882650.93765: variable 'omit' from source: magic vars 15794 1726882650.93772: starting attempt loop 15794 1726882650.93775: running the handler 15794 1726882650.93878: variable 'ansible_facts' from source: unknown 15794 1726882650.94748: _low_level_execute_command(): starting 15794 1726882650.94755: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882650.95329: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882650.95333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882650.95339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15794 1726882650.95342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882650.95399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882650.95403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882650.95405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882650.95478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882650.97364: stdout chunk (state=3): >>>/root <<< 15794 1726882650.97394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882650.97495: stderr chunk (state=3): >>><<< 15794 1726882650.97499: stdout chunk (state=3): >>><<< 15794 1726882650.97641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882650.97645: _low_level_execute_command(): starting 15794 1726882650.97649: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650 `" && echo ansible-tmp-1726882650.9752455-17606-234145751494650="` echo /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650 `" ) && sleep 0' 15794 1726882650.98439: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882650.98470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882650.98561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882650.98586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882650.98638: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15794 1726882650.98770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882651.00758: stdout chunk (state=3): >>>ansible-tmp-1726882650.9752455-17606-234145751494650=/root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650 <<< 15794 1726882651.00942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882651.00958: stderr chunk (state=3): >>><<< 15794 1726882651.00970: stdout chunk (state=3): >>><<< 15794 1726882651.00997: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882650.9752455-17606-234145751494650=/root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882651.01058: variable 'ansible_module_compression' from source: unknown 15794 1726882651.01112: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15794 1726882651.01339: variable 'ansible_facts' from source: unknown 15794 1726882651.01410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/AnsiballZ_systemd.py 15794 1726882651.01586: Sending initial data 15794 1726882651.01690: Sent initial data (156 bytes) 15794 1726882651.02444: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882651.02462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882651.02482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882651.02505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882651.02527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882651.02554: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882651.02666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 15794 1726882651.02686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882651.02709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 
1726882651.02952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882651.04536: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15794 1726882651.04540: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882651.04592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882651.04655: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpy7jcg8qg /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/AnsiballZ_systemd.py <<< 15794 1726882651.04662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/AnsiballZ_systemd.py" <<< 15794 1726882651.04710: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpy7jcg8qg" to remote "/root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/AnsiballZ_systemd.py" <<< 15794 1726882651.06855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882651.06931: stderr chunk (state=3): >>><<< 15794 1726882651.06937: stdout chunk (state=3): >>><<< 15794 1726882651.06977: done transferring module to remote 15794 1726882651.06985: _low_level_execute_command(): starting 15794 1726882651.06988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/ /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/AnsiballZ_systemd.py && sleep 0' 15794 1726882651.07833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 
1726882651.07884: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882651.08046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882651.08063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882651.08118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882651.09923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882651.09969: stderr chunk (state=3): >>><<< 15794 1726882651.09972: stdout chunk (state=3): >>><<< 15794 1726882651.09989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882651.09992: _low_level_execute_command(): starting 15794 1726882651.09998: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/AnsiballZ_systemd.py && sleep 0' 15794 1726882651.10407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882651.10511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882651.10515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882651.10517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882651.10520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882651.10531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882651.10550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882651.10619: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 15794 1726882651.42855: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "652", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ExecMainStartTimestampMonotonic": "15833159", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "652", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_ti<<< 15794 1726882651.42886: stdout chunk (state=3): >>>me=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11935744", "MemoryAvailable": "infinity", "CPUUsageNSec": "1299687000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": 
"read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "Ca<<< 15794 1726882651.42906: stdout chunk (state=3): >>>cheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target 
multi-user.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "network-pre.target system.slice systemd-journald.socket dbus.socket sysinit.target dbus-broker.service cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:32 EDT", "StateChangeTimestampMonotonic": "366878571", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:41 EDT", "InactiveExitTimestampMonotonic": "15833421", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ActiveEnterTimestampMonotonic": "15948855", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ConditionTimestampMonotonic": "15822215", "AssertTimestamp": "Fri 2024-09-20 21:27:41 EDT", "AssertTimestampMonotonic": "15822218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "9d67906d6bf74ff48c21207bf47afee4", "CollectMode": "inactive"}, 
"enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15794 1726882651.44957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882651.45021: stderr chunk (state=3): >>><<< 15794 1726882651.45024: stdout chunk (state=3): >>><<< 15794 1726882651.45044: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "652", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ExecMainStartTimestampMonotonic": "15833159", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "652", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": 
"{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11935744", "MemoryAvailable": "infinity", "CPUUsageNSec": "1299687000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", 
"MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid 
cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "network-pre.target system.slice systemd-journald.socket dbus.socket sysinit.target dbus-broker.service cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:32 EDT", "StateChangeTimestampMonotonic": "366878571", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:41 EDT", "InactiveExitTimestampMonotonic": "15833421", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:41 EDT", "ActiveEnterTimestampMonotonic": "15948855", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:41 EDT", 
"ConditionTimestampMonotonic": "15822215", "AssertTimestamp": "Fri 2024-09-20 21:27:41 EDT", "AssertTimestampMonotonic": "15822218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "9d67906d6bf74ff48c21207bf47afee4", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882651.45222: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882651.45245: _low_level_execute_command(): starting 15794 1726882651.45248: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882650.9752455-17606-234145751494650/ > /dev/null 2>&1 && sleep 0' 15794 1726882651.45715: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882651.45756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882651.45759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882651.45762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882651.45764: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882651.45766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882651.45811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882651.45824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882651.45891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882651.47767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882651.47820: stderr chunk (state=3): >>><<< 15794 1726882651.47825: stdout chunk (state=3): >>><<< 15794 1726882651.47840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882651.47849: handler run complete 15794 1726882651.47898: attempt loop complete, returning result 15794 1726882651.47905: _execute() done 15794 1726882651.47908: dumping result to json 15794 1726882651.47924: done dumping result, returning 15794 1726882651.47935: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-94e5-e48f-000000000068] 15794 1726882651.47942: sending task result for task 0affe814-3a2d-94e5-e48f-000000000068 15794 1726882651.48236: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000068 15794 1726882651.48240: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882651.48304: no more pending results, returning what we have 15794 1726882651.48307: results queue empty 15794 1726882651.48308: checking for any_errors_fatal 15794 1726882651.48314: done checking for any_errors_fatal 15794 1726882651.48315: checking for max_fail_percentage 15794 1726882651.48317: done checking for max_fail_percentage 15794 1726882651.48317: checking to see if all hosts have failed and the running result is not ok 15794 1726882651.48318: done checking to see if all hosts have failed 15794 1726882651.48319: getting the remaining hosts for this loop 15794 1726882651.48322: done getting the remaining hosts for this loop 15794 1726882651.48326: getting the next task for host managed_node1 15794 1726882651.48332: done getting next task for host managed_node1 15794 1726882651.48338: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15794 1726882651.48341: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882651.48360: getting variables 15794 1726882651.48362: in VariableManager get_vars() 15794 1726882651.48399: Calling all_inventory to load vars for managed_node1 15794 1726882651.48402: Calling groups_inventory to load vars for managed_node1 15794 1726882651.48405: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882651.48415: Calling all_plugins_play to load vars for managed_node1 15794 1726882651.48418: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882651.48422: Calling groups_plugins_play to load vars for managed_node1 15794 1726882651.49801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882651.51430: done with get_vars() 15794 1726882651.51455: done getting variables 15794 1726882651.51511: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:31 -0400 (0:00:00.677) 0:00:49.073 ****** 15794 1726882651.51538: entering _queue_task() for managed_node1/service 15794 1726882651.51810: worker is 1 (out of 1 available) 15794 1726882651.51825: exiting _queue_task() for managed_node1/service 15794 1726882651.51840: done queuing things up, now waiting for results queue to drain 15794 1726882651.51842: waiting for pending results... 
15794 1726882651.52373: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15794 1726882651.52378: in run() - task 0affe814-3a2d-94e5-e48f-000000000069 15794 1726882651.52385: variable 'ansible_search_path' from source: unknown 15794 1726882651.52387: variable 'ansible_search_path' from source: unknown 15794 1726882651.52390: calling self._execute() 15794 1726882651.52507: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882651.52511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882651.52514: variable 'omit' from source: magic vars 15794 1726882651.53001: variable 'ansible_distribution_major_version' from source: facts 15794 1726882651.53037: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882651.53136: variable 'network_provider' from source: set_fact 15794 1726882651.53142: Evaluated conditional (network_provider == "nm"): True 15794 1726882651.53223: variable '__network_wpa_supplicant_required' from source: role '' defaults 15794 1726882651.53306: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15794 1726882651.53467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882651.55170: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882651.55229: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882651.55262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882651.55304: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882651.55326: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882651.55405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882651.55433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882651.55458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882651.55494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882651.55507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882651.55553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882651.55573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882651.55597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882651.55632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882651.55646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882651.55698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882651.55736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882651.55762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882651.55795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882651.55809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882651.56025: variable 'network_connections' from source: play vars 15794 1726882651.56029: variable 'profile' from source: play vars 15794 1726882651.56243: variable 'profile' from source: play vars 15794 1726882651.56247: variable 'interface' from source: set_fact 15794 1726882651.56250: variable 'interface' from source: set_fact 15794 1726882651.56262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15794 1726882651.56446: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15794 1726882651.56491: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15794 1726882651.56532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15794 1726882651.56610: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15794 1726882651.56709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15794 1726882651.56747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15794 1726882651.56794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882651.56826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15794 1726882651.57146: variable '__network_wireless_connections_defined' from source: role '' defaults 15794 1726882651.57345: variable 'network_connections' from source: play vars 15794 1726882651.57370: variable 'profile' from source: play vars 15794 1726882651.57495: variable 'profile' from source: play vars 15794 1726882651.57500: variable 'interface' from source: set_fact 15794 1726882651.57588: variable 'interface' from source: set_fact 15794 1726882651.57664: Evaluated conditional (__network_wpa_supplicant_required): False 15794 1726882651.57671: when evaluation is False, skipping this task 15794 1726882651.57674: _execute() done 15794 1726882651.57686: dumping result 
to json 15794 1726882651.57689: done dumping result, returning 15794 1726882651.57692: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-94e5-e48f-000000000069] 15794 1726882651.57694: sending task result for task 0affe814-3a2d-94e5-e48f-000000000069 15794 1726882651.57808: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000069 15794 1726882651.58058: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15794 1726882651.58152: no more pending results, returning what we have 15794 1726882651.58156: results queue empty 15794 1726882651.58157: checking for any_errors_fatal 15794 1726882651.58275: done checking for any_errors_fatal 15794 1726882651.58277: checking for max_fail_percentage 15794 1726882651.58282: done checking for max_fail_percentage 15794 1726882651.58283: checking to see if all hosts have failed and the running result is not ok 15794 1726882651.58284: done checking to see if all hosts have failed 15794 1726882651.58285: getting the remaining hosts for this loop 15794 1726882651.58286: done getting the remaining hosts for this loop 15794 1726882651.58291: getting the next task for host managed_node1 15794 1726882651.58297: done getting next task for host managed_node1 15794 1726882651.58302: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15794 1726882651.58304: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882651.58476: getting variables 15794 1726882651.58481: in VariableManager get_vars() 15794 1726882651.58522: Calling all_inventory to load vars for managed_node1 15794 1726882651.58525: Calling groups_inventory to load vars for managed_node1 15794 1726882651.58528: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882651.58541: Calling all_plugins_play to load vars for managed_node1 15794 1726882651.58545: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882651.58549: Calling groups_plugins_play to load vars for managed_node1 15794 1726882651.62157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882651.63896: done with get_vars() 15794 1726882651.63921: done getting variables 15794 1726882651.63973: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:31 -0400 (0:00:00.124) 0:00:49.197 ****** 15794 1726882651.64001: entering _queue_task() for managed_node1/service 15794 1726882651.64269: worker is 1 (out of 1 available) 15794 1726882651.64288: exiting _queue_task() for managed_node1/service 15794 1726882651.64303: done queuing things up, now waiting for results queue to drain 15794 1726882651.64305: waiting for pending results... 
15794 1726882651.64519: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15794 1726882651.64646: in run() - task 0affe814-3a2d-94e5-e48f-00000000006a 15794 1726882651.64650: variable 'ansible_search_path' from source: unknown 15794 1726882651.64760: variable 'ansible_search_path' from source: unknown 15794 1726882651.64764: calling self._execute() 15794 1726882651.64941: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882651.64945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882651.64948: variable 'omit' from source: magic vars 15794 1726882651.65424: variable 'ansible_distribution_major_version' from source: facts 15794 1726882651.65428: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882651.65530: variable 'network_provider' from source: set_fact 15794 1726882651.65540: Evaluated conditional (network_provider == "initscripts"): False 15794 1726882651.65544: when evaluation is False, skipping this task 15794 1726882651.65547: _execute() done 15794 1726882651.65549: dumping result to json 15794 1726882651.65552: done dumping result, returning 15794 1726882651.65555: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-94e5-e48f-00000000006a] 15794 1726882651.65556: sending task result for task 0affe814-3a2d-94e5-e48f-00000000006a 15794 1726882651.65620: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000006a 15794 1726882651.65623: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15794 1726882651.65675: no more pending results, returning what we have 15794 1726882651.65680: results queue empty 15794 1726882651.65681: checking for any_errors_fatal 15794 1726882651.65691: done checking for 
any_errors_fatal 15794 1726882651.65692: checking for max_fail_percentage 15794 1726882651.65694: done checking for max_fail_percentage 15794 1726882651.65694: checking to see if all hosts have failed and the running result is not ok 15794 1726882651.65695: done checking to see if all hosts have failed 15794 1726882651.65696: getting the remaining hosts for this loop 15794 1726882651.65699: done getting the remaining hosts for this loop 15794 1726882651.65703: getting the next task for host managed_node1 15794 1726882651.65709: done getting next task for host managed_node1 15794 1726882651.65714: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15794 1726882651.65716: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882651.65731: getting variables 15794 1726882651.65733: in VariableManager get_vars() 15794 1726882651.65768: Calling all_inventory to load vars for managed_node1 15794 1726882651.65771: Calling groups_inventory to load vars for managed_node1 15794 1726882651.65774: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882651.65784: Calling all_plugins_play to load vars for managed_node1 15794 1726882651.65787: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882651.65790: Calling groups_plugins_play to load vars for managed_node1 15794 1726882651.69002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882651.74987: done with get_vars() 15794 1726882651.75038: done getting variables 15794 1726882651.75115: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:31 -0400 (0:00:00.113) 0:00:49.311 ****** 15794 1726882651.75362: entering _queue_task() for managed_node1/copy 15794 1726882651.76168: worker is 1 (out of 1 available) 15794 1726882651.76182: exiting _queue_task() for managed_node1/copy 15794 1726882651.76194: done queuing things up, now waiting for results queue to drain 15794 1726882651.76195: waiting for pending results... 
15794 1726882651.76756: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15794 1726882651.76944: in run() - task 0affe814-3a2d-94e5-e48f-00000000006b 15794 1726882651.76949: variable 'ansible_search_path' from source: unknown 15794 1726882651.76952: variable 'ansible_search_path' from source: unknown 15794 1726882651.77020: calling self._execute() 15794 1726882651.77277: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882651.77292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882651.77378: variable 'omit' from source: magic vars 15794 1726882651.78314: variable 'ansible_distribution_major_version' from source: facts 15794 1726882651.78423: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882651.78697: variable 'network_provider' from source: set_fact 15794 1726882651.78712: Evaluated conditional (network_provider == "initscripts"): False 15794 1726882651.78748: when evaluation is False, skipping this task 15794 1726882651.78758: _execute() done 15794 1726882651.78766: dumping result to json 15794 1726882651.78906: done dumping result, returning 15794 1726882651.78911: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-94e5-e48f-00000000006b] 15794 1726882651.78913: sending task result for task 0affe814-3a2d-94e5-e48f-00000000006b 15794 1726882651.79233: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000006b 15794 1726882651.79239: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15794 1726882651.79294: no more pending results, returning what we have 15794 1726882651.79298: results queue empty 15794 1726882651.79300: checking for 
any_errors_fatal 15794 1726882651.79306: done checking for any_errors_fatal 15794 1726882651.79307: checking for max_fail_percentage 15794 1726882651.79309: done checking for max_fail_percentage 15794 1726882651.79309: checking to see if all hosts have failed and the running result is not ok 15794 1726882651.79310: done checking to see if all hosts have failed 15794 1726882651.79311: getting the remaining hosts for this loop 15794 1726882651.79314: done getting the remaining hosts for this loop 15794 1726882651.79318: getting the next task for host managed_node1 15794 1726882651.79323: done getting next task for host managed_node1 15794 1726882651.79327: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15794 1726882651.79330: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882651.79347: getting variables 15794 1726882651.79349: in VariableManager get_vars() 15794 1726882651.79392: Calling all_inventory to load vars for managed_node1 15794 1726882651.79396: Calling groups_inventory to load vars for managed_node1 15794 1726882651.79399: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882651.79411: Calling all_plugins_play to load vars for managed_node1 15794 1726882651.79415: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882651.79418: Calling groups_plugins_play to load vars for managed_node1 15794 1726882651.83741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882651.87803: done with get_vars() 15794 1726882651.87845: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:31 -0400 (0:00:00.125) 0:00:49.437 ****** 15794 1726882651.87950: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15794 1726882651.88331: worker is 1 (out of 1 available) 15794 1726882651.88352: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15794 1726882651.88366: done queuing things up, now waiting for results queue to drain 15794 1726882651.88367: waiting for pending results... 
15794 1726882651.88665: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15794 1726882651.88912: in run() - task 0affe814-3a2d-94e5-e48f-00000000006c 15794 1726882651.88919: variable 'ansible_search_path' from source: unknown 15794 1726882651.88923: variable 'ansible_search_path' from source: unknown 15794 1726882651.88926: calling self._execute() 15794 1726882651.89019: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882651.89052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882651.89071: variable 'omit' from source: magic vars 15794 1726882651.89609: variable 'ansible_distribution_major_version' from source: facts 15794 1726882651.89666: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882651.89669: variable 'omit' from source: magic vars 15794 1726882651.89713: variable 'omit' from source: magic vars 15794 1726882651.89948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15794 1726882651.92904: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15794 1726882651.93039: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15794 1726882651.93061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15794 1726882651.93121: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15794 1726882651.93161: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15794 1726882651.93271: variable 'network_provider' from source: set_fact 15794 1726882651.93526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15794 1726882651.93530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15794 1726882651.93557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15794 1726882651.93612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15794 1726882651.93651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15794 1726882651.93762: variable 'omit' from source: magic vars 15794 1726882651.93915: variable 'omit' from source: magic vars 15794 1726882651.94110: variable 'network_connections' from source: play vars 15794 1726882651.94127: variable 'profile' from source: play vars 15794 1726882651.94250: variable 'profile' from source: play vars 15794 1726882651.94262: variable 'interface' from source: set_fact 15794 1726882651.94373: variable 'interface' from source: set_fact 15794 1726882651.94867: variable 'omit' from source: magic vars 15794 1726882651.94871: variable '__lsr_ansible_managed' from source: task vars 15794 1726882651.95014: variable '__lsr_ansible_managed' from source: task vars 15794 1726882651.96039: Loaded config def from plugin (lookup/template) 15794 1726882651.96043: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15794 1726882651.96045: File lookup term: get_ansible_managed.j2 15794 
1726882651.96048: variable 'ansible_search_path' from source: unknown 15794 1726882651.96050: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15794 1726882651.96055: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15794 1726882651.96058: variable 'ansible_search_path' from source: unknown 15794 1726882652.06923: variable 'ansible_managed' from source: unknown 15794 1726882652.07293: variable 'omit' from source: magic vars 15794 1726882652.07348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882652.07465: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882652.07520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882652.07573: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.07623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.07842: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882652.07846: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.07849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.07852: Set connection var ansible_connection to ssh 15794 1726882652.07854: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882652.07856: Set connection var ansible_pipelining to False 15794 1726882652.07858: Set connection var ansible_shell_executable to /bin/sh 15794 1726882652.07860: Set connection var ansible_shell_type to sh 15794 1726882652.07873: Set connection var ansible_timeout to 10 15794 1726882652.07917: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.07931: variable 'ansible_connection' from source: unknown 15794 1726882652.07945: variable 'ansible_module_compression' from source: unknown 15794 1726882652.07955: variable 'ansible_shell_type' from source: unknown 15794 1726882652.07963: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.07971: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.07983: variable 'ansible_pipelining' from source: unknown 15794 1726882652.07992: variable 'ansible_timeout' from source: unknown 15794 1726882652.08001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.08173: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882652.08204: variable 'omit' from source: magic vars 15794 1726882652.08219: starting attempt loop 15794 1726882652.08229: running the handler 15794 1726882652.08252: _low_level_execute_command(): starting 15794 1726882652.08274: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882652.09500: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882652.09522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.09588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.09729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882652.09750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882652.09806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882652.09919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882652.11686: stdout chunk (state=3): >>>/root <<< 15794 
1726882652.11803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882652.11862: stderr chunk (state=3): >>><<< 15794 1726882652.11865: stdout chunk (state=3): >>><<< 15794 1726882652.11884: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882652.11909: _low_level_execute_command(): starting 15794 1726882652.11913: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932 `" && echo ansible-tmp-1726882652.1188836-17640-196503557335932="` echo /root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932 `" ) && sleep 0' 15794 1726882652.12407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config <<< 15794 1726882652.12411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882652.12413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.12415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882652.12418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.12476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882652.12481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882652.12544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882652.14513: stdout chunk (state=3): >>>ansible-tmp-1726882652.1188836-17640-196503557335932=/root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932 <<< 15794 1726882652.14632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882652.14676: stderr chunk (state=3): >>><<< 15794 1726882652.14683: stdout chunk (state=3): >>><<< 15794 1726882652.14695: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882652.1188836-17640-196503557335932=/root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932 , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882652.14736: variable 'ansible_module_compression' from source: unknown 15794 1726882652.14774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15794 1726882652.14813: variable 'ansible_facts' from source: unknown 15794 1726882652.14905: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/AnsiballZ_network_connections.py 15794 1726882652.15016: Sending initial data 15794 1726882652.15020: Sent initial data (168 bytes) 15794 1726882652.15433: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882652.15475: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882652.15480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882652.15486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882652.15489: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882652.15491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.15523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882652.15531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882652.15557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882652.15614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882652.17195: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882652.17250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882652.17310: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpvcobqsl9 /root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/AnsiballZ_network_connections.py <<< 15794 1726882652.17313: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/AnsiballZ_network_connections.py" <<< 15794 1726882652.17358: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpvcobqsl9" to remote "/root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/AnsiballZ_network_connections.py" <<< 15794 1726882652.17369: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/AnsiballZ_network_connections.py" <<< 15794 1726882652.18555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882652.18608: stderr chunk (state=3): >>><<< 15794 1726882652.18612: stdout chunk (state=3): >>><<< 15794 1726882652.18633: done transferring module to remote 15794 1726882652.18644: _low_level_execute_command(): starting 15794 1726882652.18648: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/ /root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/AnsiballZ_network_connections.py && sleep 0' 15794 1726882652.19166: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882652.19170: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882652.19172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.19224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882652.19241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882652.19296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882652.21171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882652.21195: stderr chunk (state=3): >>><<< 15794 1726882652.21198: stdout chunk (state=3): >>><<< 15794 1726882652.21230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882652.21236: _low_level_execute_command(): starting 15794 1726882652.21239: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/AnsiballZ_network_connections.py && sleep 0' 15794 1726882652.21708: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882652.21712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.21717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882652.21720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882652.21723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.21787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882652.21818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882652.21885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882652.51659: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kes8kdjz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kes8kdjz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/666af3c8-45ed-476e-bdc6-601fe256e49b: error=unknown <<< 15794 1726882652.51825: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}} <<< 15794 1726882652.53872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882652.53876: stdout chunk (state=3): >>><<< 15794 1726882652.53881: stderr chunk (state=3): >>><<< 15794 1726882652.53904: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kes8kdjz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kes8kdjz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/666af3c8-45ed-476e-bdc6-601fe256e49b: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882652.54052: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882652.54056: _low_level_execute_command(): starting 15794 1726882652.54059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882652.1188836-17640-196503557335932/ > /dev/null 2>&1 && sleep 0' 15794 1726882652.54764: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882652.54821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.54926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882652.54949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882652.55046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882652.57372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882652.57466: stderr chunk (state=3): >>><<< 15794 1726882652.57470: stdout chunk (state=3): >>><<< 15794 1726882652.57473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882652.57482: handler run complete 15794 1726882652.57484: attempt loop complete, returning result 15794 1726882652.57487: _execute() done 15794 1726882652.57489: dumping result to json 15794 1726882652.57491: done dumping result, returning 15794 1726882652.57493: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-94e5-e48f-00000000006c] 15794 1726882652.57495: sending task result for task 0affe814-3a2d-94e5-e48f-00000000006c 15794 1726882652.57695: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000006c 15794 1726882652.57698: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15794 1726882652.57866: no more pending results, returning what we have 15794 1726882652.57870: results 
queue empty 15794 1726882652.57872: checking for any_errors_fatal 15794 1726882652.57882: done checking for any_errors_fatal 15794 1726882652.57883: checking for max_fail_percentage 15794 1726882652.57885: done checking for max_fail_percentage 15794 1726882652.57886: checking to see if all hosts have failed and the running result is not ok 15794 1726882652.57887: done checking to see if all hosts have failed 15794 1726882652.57888: getting the remaining hosts for this loop 15794 1726882652.57891: done getting the remaining hosts for this loop 15794 1726882652.57896: getting the next task for host managed_node1 15794 1726882652.57905: done getting next task for host managed_node1 15794 1726882652.57910: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15794 1726882652.57912: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882652.57924: getting variables 15794 1726882652.57926: in VariableManager get_vars() 15794 1726882652.58276: Calling all_inventory to load vars for managed_node1 15794 1726882652.58280: Calling groups_inventory to load vars for managed_node1 15794 1726882652.58284: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882652.58296: Calling all_plugins_play to load vars for managed_node1 15794 1726882652.58301: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882652.58305: Calling groups_plugins_play to load vars for managed_node1 15794 1726882652.59915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882652.62206: done with get_vars() 15794 1726882652.62229: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:32 -0400 (0:00:00.743) 0:00:50.181 ****** 15794 1726882652.62309: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15794 1726882652.62570: worker is 1 (out of 1 available) 15794 1726882652.62595: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15794 1726882652.62609: done queuing things up, now waiting for results queue to drain 15794 1726882652.62611: waiting for pending results... 
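The `changed: [managed_node1]` result logged earlier for the connection-profiles task is plain JSON and can be loaded back and inspected. A minimal sketch — the JSON below is transcribed (abbreviated) from the log above, not generated by running anything:

```python
import json

# Task result transcribed from the log above, trimmed to the logged keys.
raw = """
{
  "_invocation": {
    "module_args": {
      "__debug_flags": "",
      "connections": [{"name": "lsr27", "persistent_state": "absent"}],
      "force_state_change": false,
      "ignore_errors": false,
      "provider": "nm"
    }
  },
  "changed": true
}
"""
result = json.loads(raw)

# "changed: [managed_node1]" in the log corresponds to this flag being true:
print(result["changed"])  # True

# The role asked the "nm" (NetworkManager) provider to remove profile "lsr27":
conn = result["_invocation"]["module_args"]["connections"][0]
print(conn["name"], conn["persistent_state"])  # lsr27 absent
```

Reading the result this way is handy when replaying captured CI logs; the keys shown are exactly those emitted in the transcript.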
15794 1726882652.62930: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15794 1726882652.62961: in run() - task 0affe814-3a2d-94e5-e48f-00000000006d 15794 1726882652.63024: variable 'ansible_search_path' from source: unknown 15794 1726882652.63028: variable 'ansible_search_path' from source: unknown 15794 1726882652.63032: calling self._execute() 15794 1726882652.63448: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.63452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.63455: variable 'omit' from source: magic vars 15794 1726882652.63587: variable 'ansible_distribution_major_version' from source: facts 15794 1726882652.63599: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882652.63754: variable 'network_state' from source: role '' defaults 15794 1726882652.63767: Evaluated conditional (network_state != {}): False 15794 1726882652.63771: when evaluation is False, skipping this task 15794 1726882652.63774: _execute() done 15794 1726882652.63776: dumping result to json 15794 1726882652.63787: done dumping result, returning 15794 1726882652.63793: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-94e5-e48f-00000000006d] 15794 1726882652.63800: sending task result for task 0affe814-3a2d-94e5-e48f-00000000006d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15794 1726882652.63963: no more pending results, returning what we have 15794 1726882652.63967: results queue empty 15794 1726882652.63968: checking for any_errors_fatal 15794 1726882652.63977: done checking for any_errors_fatal 15794 1726882652.63978: checking for max_fail_percentage 15794 1726882652.63979: done checking for max_fail_percentage 15794 1726882652.63981: 
checking to see if all hosts have failed and the running result is not ok 15794 1726882652.63982: done checking to see if all hosts have failed 15794 1726882652.63983: getting the remaining hosts for this loop 15794 1726882652.63985: done getting the remaining hosts for this loop 15794 1726882652.63990: getting the next task for host managed_node1 15794 1726882652.64003: done getting next task for host managed_node1 15794 1726882652.64008: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15794 1726882652.64010: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882652.64024: getting variables 15794 1726882652.64026: in VariableManager get_vars() 15794 1726882652.64062: Calling all_inventory to load vars for managed_node1 15794 1726882652.64065: Calling groups_inventory to load vars for managed_node1 15794 1726882652.64067: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882652.64077: Calling all_plugins_play to load vars for managed_node1 15794 1726882652.64080: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882652.64084: Calling groups_plugins_play to load vars for managed_node1 15794 1726882652.64617: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000006d 15794 1726882652.64621: WORKER PROCESS EXITING 15794 1726882652.66032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882652.67627: done with get_vars() 15794 1726882652.67652: done getting variables 15794 1726882652.67706: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:32 -0400 (0:00:00.054) 0:00:50.235 ****** 15794 1726882652.67732: entering _queue_task() for managed_node1/debug 15794 1726882652.67990: worker is 1 (out of 1 available) 15794 1726882652.68004: exiting _queue_task() for managed_node1/debug 15794 1726882652.68016: done queuing things up, now waiting for results queue to drain 15794 1726882652.68018: waiting for pending results... 15794 1726882652.68231: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15794 1726882652.68337: in run() - task 0affe814-3a2d-94e5-e48f-00000000006e 15794 1726882652.68365: variable 'ansible_search_path' from source: unknown 15794 1726882652.68370: variable 'ansible_search_path' from source: unknown 15794 1726882652.68406: calling self._execute() 15794 1726882652.68514: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.68543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.68546: variable 'omit' from source: magic vars 15794 1726882652.69439: variable 'ansible_distribution_major_version' from source: facts 15794 1726882652.69443: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882652.69447: variable 'omit' from source: magic vars 15794 1726882652.69450: variable 'omit' from source: magic vars 15794 1726882652.69453: variable 'omit' from source: magic vars 15794 1726882652.69455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882652.69458: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882652.69461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882652.69464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.69466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.69469: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882652.69472: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.69474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.69476: Set connection var ansible_connection to ssh 15794 1726882652.69479: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882652.69481: Set connection var ansible_pipelining to False 15794 1726882652.69483: Set connection var ansible_shell_executable to /bin/sh 15794 1726882652.69485: Set connection var ansible_shell_type to sh 15794 1726882652.69488: Set connection var ansible_timeout to 10 15794 1726882652.69522: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.69525: variable 'ansible_connection' from source: unknown 15794 1726882652.69529: variable 'ansible_module_compression' from source: unknown 15794 1726882652.69532: variable 'ansible_shell_type' from source: unknown 15794 1726882652.69536: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.69605: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.69609: variable 'ansible_pipelining' from source: unknown 15794 1726882652.69611: variable 'ansible_timeout' from source: unknown 15794 1726882652.69614: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15794 1726882652.69721: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882652.69823: variable 'omit' from source: magic vars 15794 1726882652.69827: starting attempt loop 15794 1726882652.69829: running the handler 15794 1726882652.69912: variable '__network_connections_result' from source: set_fact 15794 1726882652.69971: handler run complete 15794 1726882652.70005: attempt loop complete, returning result 15794 1726882652.70008: _execute() done 15794 1726882652.70011: dumping result to json 15794 1726882652.70016: done dumping result, returning 15794 1726882652.70027: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-94e5-e48f-00000000006e] 15794 1726882652.70039: sending task result for task 0affe814-3a2d-94e5-e48f-00000000006e 15794 1726882652.70129: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000006e 15794 1726882652.70133: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15794 1726882652.70259: no more pending results, returning what we have 15794 1726882652.70264: results queue empty 15794 1726882652.70265: checking for any_errors_fatal 15794 1726882652.70273: done checking for any_errors_fatal 15794 1726882652.70274: checking for max_fail_percentage 15794 1726882652.70276: done checking for max_fail_percentage 15794 1726882652.70278: checking to see if all hosts have failed and the running result is not ok 15794 1726882652.70281: done checking to see if all hosts have failed 15794 1726882652.70282: getting the remaining hosts for this loop 15794 1726882652.70285: done getting the remaining hosts for this loop 
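The debug task above reports `stderr_lines` of `[""]` for a captured `stderr` of `"\n"`. That pairing follows directly from Python's `str.splitlines()`, which is the usual way the `_lines` variants of result fields are derived (a sketch of the behavior, not ansible-core's exact code path):

```python
stderr = "\n"            # what the module reported on stderr
lines = stderr.splitlines()
print(lines)             # [''] -> logged as "stderr_lines": [ "" ]

# An entirely empty stream, by contrast, yields no lines at all:
print("".splitlines())   # []
```

So `[""]` in the log means "one blank line was written to stderr", not "stderr was empty".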
15794 1726882652.70290: getting the next task for host managed_node1 15794 1726882652.70298: done getting next task for host managed_node1 15794 1726882652.70304: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15794 1726882652.70307: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882652.70318: getting variables 15794 1726882652.70320: in VariableManager get_vars() 15794 1726882652.70541: Calling all_inventory to load vars for managed_node1 15794 1726882652.70544: Calling groups_inventory to load vars for managed_node1 15794 1726882652.70547: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882652.70557: Calling all_plugins_play to load vars for managed_node1 15794 1726882652.70561: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882652.70564: Calling groups_plugins_play to load vars for managed_node1 15794 1726882652.73003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882652.75514: done with get_vars() 15794 1726882652.75543: done getting variables 15794 1726882652.75600: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:32 -0400 (0:00:00.078) 0:00:50.314 
****** 15794 1726882652.75627: entering _queue_task() for managed_node1/debug 15794 1726882652.75916: worker is 1 (out of 1 available) 15794 1726882652.75931: exiting _queue_task() for managed_node1/debug 15794 1726882652.75948: done queuing things up, now waiting for results queue to drain 15794 1726882652.75949: waiting for pending results... 15794 1726882652.76136: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15794 1726882652.76225: in run() - task 0affe814-3a2d-94e5-e48f-00000000006f 15794 1726882652.76242: variable 'ansible_search_path' from source: unknown 15794 1726882652.76246: variable 'ansible_search_path' from source: unknown 15794 1726882652.76282: calling self._execute() 15794 1726882652.76357: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.76364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.76374: variable 'omit' from source: magic vars 15794 1726882652.76699: variable 'ansible_distribution_major_version' from source: facts 15794 1726882652.76710: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882652.76716: variable 'omit' from source: magic vars 15794 1726882652.76757: variable 'omit' from source: magic vars 15794 1726882652.76791: variable 'omit' from source: magic vars 15794 1726882652.76826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882652.76861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882652.76883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882652.76899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.76910: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.76942: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882652.76948: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.76951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.77035: Set connection var ansible_connection to ssh 15794 1726882652.77043: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882652.77051: Set connection var ansible_pipelining to False 15794 1726882652.77061: Set connection var ansible_shell_executable to /bin/sh 15794 1726882652.77065: Set connection var ansible_shell_type to sh 15794 1726882652.77074: Set connection var ansible_timeout to 10 15794 1726882652.77099: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.77103: variable 'ansible_connection' from source: unknown 15794 1726882652.77107: variable 'ansible_module_compression' from source: unknown 15794 1726882652.77109: variable 'ansible_shell_type' from source: unknown 15794 1726882652.77112: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.77117: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.77122: variable 'ansible_pipelining' from source: unknown 15794 1726882652.77124: variable 'ansible_timeout' from source: unknown 15794 1726882652.77130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.77252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882652.77264: variable 'omit' from source: magic vars 15794 1726882652.77270: starting attempt 
loop 15794 1726882652.77274: running the handler 15794 1726882652.77321: variable '__network_connections_result' from source: set_fact 15794 1726882652.77391: variable '__network_connections_result' from source: set_fact 15794 1726882652.77483: handler run complete 15794 1726882652.77507: attempt loop complete, returning result 15794 1726882652.77511: _execute() done 15794 1726882652.77513: dumping result to json 15794 1726882652.77519: done dumping result, returning 15794 1726882652.77530: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-94e5-e48f-00000000006f] 15794 1726882652.77538: sending task result for task 0affe814-3a2d-94e5-e48f-00000000006f 15794 1726882652.77645: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000006f 15794 1726882652.77648: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15794 1726882652.77749: no more pending results, returning what we have 15794 1726882652.77753: results queue empty 15794 1726882652.77754: checking for any_errors_fatal 15794 1726882652.77761: done checking for any_errors_fatal 15794 1726882652.77762: checking for max_fail_percentage 15794 1726882652.77763: done checking for max_fail_percentage 15794 1726882652.77765: checking to see if all hosts have failed and the running result is not ok 15794 1726882652.77765: done checking to see if all hosts have failed 15794 1726882652.77766: getting the remaining hosts for this loop 15794 1726882652.77768: done getting the remaining hosts for this loop 15794 1726882652.77773: getting the next 
task for host managed_node1 15794 1726882652.77781: done getting next task for host managed_node1 15794 1726882652.77785: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15794 1726882652.77787: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882652.77796: getting variables 15794 1726882652.77798: in VariableManager get_vars() 15794 1726882652.77831: Calling all_inventory to load vars for managed_node1 15794 1726882652.77890: Calling groups_inventory to load vars for managed_node1 15794 1726882652.77894: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882652.77904: Calling all_plugins_play to load vars for managed_node1 15794 1726882652.77908: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882652.77912: Calling groups_plugins_play to load vars for managed_node1 15794 1726882652.79603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882652.81267: done with get_vars() 15794 1726882652.81290: done getting variables 15794 1726882652.81343: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:32 -0400 (0:00:00.057) 0:00:50.371 ****** 15794 1726882652.81371: entering _queue_task() for 
managed_node1/debug 15794 1726882652.81591: worker is 1 (out of 1 available) 15794 1726882652.81605: exiting _queue_task() for managed_node1/debug 15794 1726882652.81618: done queuing things up, now waiting for results queue to drain 15794 1726882652.81620: waiting for pending results... 15794 1726882652.81857: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15794 1726882652.81905: in run() - task 0affe814-3a2d-94e5-e48f-000000000070 15794 1726882652.81920: variable 'ansible_search_path' from source: unknown 15794 1726882652.81924: variable 'ansible_search_path' from source: unknown 15794 1726882652.81960: calling self._execute() 15794 1726882652.82044: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.82052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.82098: variable 'omit' from source: magic vars 15794 1726882652.82519: variable 'ansible_distribution_major_version' from source: facts 15794 1726882652.82523: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882652.82652: variable 'network_state' from source: role '' defaults 15794 1726882652.82665: Evaluated conditional (network_state != {}): False 15794 1726882652.82669: when evaluation is False, skipping this task 15794 1726882652.82672: _execute() done 15794 1726882652.82676: dumping result to json 15794 1726882652.82679: done dumping result, returning 15794 1726882652.82939: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-94e5-e48f-000000000070] 15794 1726882652.82944: sending task result for task 0affe814-3a2d-94e5-e48f-000000000070 15794 1726882652.83012: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000070 15794 1726882652.83015: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": 
"network_state != {}" } 15794 1726882652.83101: no more pending results, returning what we have 15794 1726882652.83105: results queue empty 15794 1726882652.83106: checking for any_errors_fatal 15794 1726882652.83112: done checking for any_errors_fatal 15794 1726882652.83112: checking for max_fail_percentage 15794 1726882652.83114: done checking for max_fail_percentage 15794 1726882652.83115: checking to see if all hosts have failed and the running result is not ok 15794 1726882652.83116: done checking to see if all hosts have failed 15794 1726882652.83117: getting the remaining hosts for this loop 15794 1726882652.83119: done getting the remaining hosts for this loop 15794 1726882652.83123: getting the next task for host managed_node1 15794 1726882652.83129: done getting next task for host managed_node1 15794 1726882652.83133: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15794 1726882652.83138: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882652.83151: getting variables 15794 1726882652.83153: in VariableManager get_vars() 15794 1726882652.83192: Calling all_inventory to load vars for managed_node1 15794 1726882652.83195: Calling groups_inventory to load vars for managed_node1 15794 1726882652.83198: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882652.83208: Calling all_plugins_play to load vars for managed_node1 15794 1726882652.83212: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882652.83216: Calling groups_plugins_play to load vars for managed_node1 15794 1726882652.90486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882652.93629: done with get_vars() 15794 1726882652.93667: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:32 -0400 (0:00:00.124) 0:00:50.495 ****** 15794 1726882652.93776: entering _queue_task() for managed_node1/ping 15794 1726882652.94192: worker is 1 (out of 1 available) 15794 1726882652.94206: exiting _queue_task() for managed_node1/ping 15794 1726882652.94220: done queuing things up, now waiting for results queue to drain 15794 1726882652.94221: waiting for pending results... 
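Both `network_state` tasks above are skipped because the role default for `network_state` is an empty mapping, so the `when: network_state != {}` conditional evaluates to False. Ansible evaluates the conditional through Jinja2; the plain-Python equivalent of the test is simply:

```python
# Role default when the playbook supplies no declarative network state
network_state = {}

# Equivalent of the "when: network_state != {}" conditional in the log
run_task = network_state != {}
print(run_task)  # False -> "when evaluation is False, skipping this task"

# Supplying any state flips the conditional (the key here is illustrative only)
populated = {"interfaces": []}
print(populated != {})  # True -> the task would run
```

This matches the two `Evaluated conditional (network_state != {}): False` lines in the transcript.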
15794 1726882652.94506: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15794 1726882652.94627: in run() - task 0affe814-3a2d-94e5-e48f-000000000071 15794 1726882652.94636: variable 'ansible_search_path' from source: unknown 15794 1726882652.94640: variable 'ansible_search_path' from source: unknown 15794 1726882652.94686: calling self._execute() 15794 1726882652.94794: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.94804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.94817: variable 'omit' from source: magic vars 15794 1726882652.95272: variable 'ansible_distribution_major_version' from source: facts 15794 1726882652.95289: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882652.95293: variable 'omit' from source: magic vars 15794 1726882652.95343: variable 'omit' from source: magic vars 15794 1726882652.95391: variable 'omit' from source: magic vars 15794 1726882652.95436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882652.95477: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882652.95500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882652.95522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.95542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882652.95585: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882652.95588: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.95609: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15794 1726882652.95719: Set connection var ansible_connection to ssh 15794 1726882652.95728: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882652.95737: Set connection var ansible_pipelining to False 15794 1726882652.95746: Set connection var ansible_shell_executable to /bin/sh 15794 1726882652.95755: Set connection var ansible_shell_type to sh 15794 1726882652.95768: Set connection var ansible_timeout to 10 15794 1726882652.95828: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.95834: variable 'ansible_connection' from source: unknown 15794 1726882652.95840: variable 'ansible_module_compression' from source: unknown 15794 1726882652.95844: variable 'ansible_shell_type' from source: unknown 15794 1726882652.95847: variable 'ansible_shell_executable' from source: unknown 15794 1726882652.95849: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882652.95851: variable 'ansible_pipelining' from source: unknown 15794 1726882652.95853: variable 'ansible_timeout' from source: unknown 15794 1726882652.95855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882652.96141: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882652.96154: variable 'omit' from source: magic vars 15794 1726882652.96159: starting attempt loop 15794 1726882652.96162: running the handler 15794 1726882652.96164: _low_level_execute_command(): starting 15794 1726882652.96167: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882652.96960: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882652.97018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882652.97021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882652.97061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882652.97161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882652.98919: stdout chunk (state=3): >>>/root <<< 15794 1726882652.99109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882652.99140: stdout chunk (state=3): >>><<< 15794 1726882652.99144: stderr chunk (state=3): >>><<< 15794 1726882652.99164: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882652.99269: _low_level_execute_command(): starting 15794 1726882652.99273: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593 `" && echo ansible-tmp-1726882652.9917126-17682-139506595583593="` echo /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593 `" ) && sleep 0' 15794 1726882652.99874: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882652.99889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882652.99905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882652.99924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882652.99979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882652.99994: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.00014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.00078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882653.00091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.00171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882653.02138: stdout chunk (state=3): >>>ansible-tmp-1726882652.9917126-17682-139506595583593=/root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593 <<< 15794 1726882653.02260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882653.02313: stderr chunk (state=3): >>><<< 15794 1726882653.02316: stdout chunk (state=3): >>><<< 15794 1726882653.02342: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882652.9917126-17682-139506595583593=/root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882653.02382: variable 'ansible_module_compression' from source: unknown 15794 1726882653.02414: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15794 1726882653.02457: variable 'ansible_facts' from source: unknown 15794 1726882653.02517: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/AnsiballZ_ping.py 15794 1726882653.02628: Sending initial data 15794 1726882653.02632: Sent initial data (153 bytes) 15794 1726882653.03139: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882653.03145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882653.03161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.03255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.03296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882653.04912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882653.04969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882653.05031: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpppubk4gs /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/AnsiballZ_ping.py <<< 15794 1726882653.05034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/AnsiballZ_ping.py" <<< 15794 1726882653.05092: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpppubk4gs" to remote "/root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/AnsiballZ_ping.py" <<< 15794 1726882653.05924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882653.05999: stderr chunk (state=3): >>><<< 15794 1726882653.06004: stdout chunk (state=3): >>><<< 15794 1726882653.06018: done transferring module to remote 15794 1726882653.06027: _low_level_execute_command(): starting 15794 1726882653.06033: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/ /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/AnsiballZ_ping.py && sleep 0' 15794 1726882653.06478: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882653.06484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.06488: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882653.06491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.06548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882653.06554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.06607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882653.08455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882653.08512: stderr chunk (state=3): >>><<< 15794 1726882653.08515: stdout chunk (state=3): >>><<< 15794 1726882653.08535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882653.08539: _low_level_execute_command(): starting 15794 1726882653.08547: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/AnsiballZ_ping.py && sleep 0' 15794 1726882653.08974: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882653.08982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882653.08985: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.09030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882653.09039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.09103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 15794 1726882653.25903: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15794 1726882653.27501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882653.27506: stdout chunk (state=3): >>><<< 15794 1726882653.27509: stderr chunk (state=3): >>><<< 15794 1726882653.27512: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882653.27515: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882653.27517: _low_level_execute_command(): starting 15794 1726882653.27520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882652.9917126-17682-139506595583593/ > /dev/null 2>&1 && sleep 0' 15794 1726882653.27987: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882653.27991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.27993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882653.27996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.28043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882653.28062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.28112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882653.30100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882653.30103: stdout chunk (state=3): >>><<< 15794 1726882653.30339: stderr chunk (state=3): >>><<< 15794 1726882653.30344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882653.30353: handler run complete 15794 1726882653.30356: attempt loop complete, returning 
result 15794 1726882653.30358: _execute() done 15794 1726882653.30360: dumping result to json 15794 1726882653.30364: done dumping result, returning 15794 1726882653.30367: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-94e5-e48f-000000000071] 15794 1726882653.30369: sending task result for task 0affe814-3a2d-94e5-e48f-000000000071 15794 1726882653.30441: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000071 15794 1726882653.30444: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15794 1726882653.30529: no more pending results, returning what we have 15794 1726882653.30533: results queue empty 15794 1726882653.30537: checking for any_errors_fatal 15794 1726882653.30548: done checking for any_errors_fatal 15794 1726882653.30549: checking for max_fail_percentage 15794 1726882653.30551: done checking for max_fail_percentage 15794 1726882653.30552: checking to see if all hosts have failed and the running result is not ok 15794 1726882653.30562: done checking to see if all hosts have failed 15794 1726882653.30564: getting the remaining hosts for this loop 15794 1726882653.30567: done getting the remaining hosts for this loop 15794 1726882653.30572: getting the next task for host managed_node1 15794 1726882653.30582: done getting next task for host managed_node1 15794 1726882653.30585: ^ task is: TASK: meta (role_complete) 15794 1726882653.30587: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882653.30600: getting variables 15794 1726882653.30602: in VariableManager get_vars() 15794 1726882653.30799: Calling all_inventory to load vars for managed_node1 15794 1726882653.30803: Calling groups_inventory to load vars for managed_node1 15794 1726882653.30806: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882653.30818: Calling all_plugins_play to load vars for managed_node1 15794 1726882653.30822: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882653.30826: Calling groups_plugins_play to load vars for managed_node1 15794 1726882653.33848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882653.38275: done with get_vars() 15794 1726882653.38317: done getting variables 15794 1726882653.38513: done queuing things up, now waiting for results queue to drain 15794 1726882653.38516: results queue empty 15794 1726882653.38517: checking for any_errors_fatal 15794 1726882653.38651: done checking for any_errors_fatal 15794 1726882653.38652: checking for max_fail_percentage 15794 1726882653.38654: done checking for max_fail_percentage 15794 1726882653.38655: checking to see if all hosts have failed and the running result is not ok 15794 1726882653.38656: done checking to see if all hosts have failed 15794 1726882653.38657: getting the remaining hosts for this loop 15794 1726882653.38658: done getting the remaining hosts for this loop 15794 1726882653.38662: getting the next task for host managed_node1 15794 1726882653.38670: done getting next task for host managed_node1 15794 1726882653.38673: ^ task is: TASK: meta (flush_handlers) 15794 1726882653.38675: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15794 1726882653.38678: getting variables 15794 1726882653.38679: in VariableManager get_vars() 15794 1726882653.38694: Calling all_inventory to load vars for managed_node1 15794 1726882653.38697: Calling groups_inventory to load vars for managed_node1 15794 1726882653.38700: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882653.38706: Calling all_plugins_play to load vars for managed_node1 15794 1726882653.38709: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882653.38713: Calling groups_plugins_play to load vars for managed_node1 15794 1726882653.42766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882653.45807: done with get_vars() 15794 1726882653.45846: done getting variables 15794 1726882653.46021: in VariableManager get_vars() 15794 1726882653.46057: Calling all_inventory to load vars for managed_node1 15794 1726882653.46060: Calling groups_inventory to load vars for managed_node1 15794 1726882653.46063: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882653.46069: Calling all_plugins_play to load vars for managed_node1 15794 1726882653.46073: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882653.46077: Calling groups_plugins_play to load vars for managed_node1 15794 1726882653.48143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882653.52797: done with get_vars() 15794 1726882653.53046: done queuing things up, now waiting for results queue to drain 15794 1726882653.53049: results queue empty 15794 1726882653.53050: checking for any_errors_fatal 15794 1726882653.53052: done checking for any_errors_fatal 15794 1726882653.53053: checking for max_fail_percentage 15794 1726882653.53055: done checking for max_fail_percentage 15794 1726882653.53056: checking to see if all hosts have failed and 
the running result is not ok 15794 1726882653.53057: done checking to see if all hosts have failed 15794 1726882653.53058: getting the remaining hosts for this loop 15794 1726882653.53059: done getting the remaining hosts for this loop 15794 1726882653.53063: getting the next task for host managed_node1 15794 1726882653.53068: done getting next task for host managed_node1 15794 1726882653.53070: ^ task is: TASK: meta (flush_handlers) 15794 1726882653.53072: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882653.53075: getting variables 15794 1726882653.53077: in VariableManager get_vars() 15794 1726882653.53096: Calling all_inventory to load vars for managed_node1 15794 1726882653.53099: Calling groups_inventory to load vars for managed_node1 15794 1726882653.53102: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882653.53108: Calling all_plugins_play to load vars for managed_node1 15794 1726882653.53112: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882653.53116: Calling groups_plugins_play to load vars for managed_node1 15794 1726882653.56430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882653.60850: done with get_vars() 15794 1726882653.60891: done getting variables 15794 1726882653.60956: in VariableManager get_vars() 15794 1726882653.60971: Calling all_inventory to load vars for managed_node1 15794 1726882653.60974: Calling groups_inventory to load vars for managed_node1 15794 1726882653.60977: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882653.60986: Calling all_plugins_play to load vars for managed_node1 15794 1726882653.60989: Calling 
groups_plugins_inventory to load vars for managed_node1 15794 1726882653.60993: Calling groups_plugins_play to load vars for managed_node1 15794 1726882653.63781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882653.69504: done with get_vars() 15794 1726882653.69713: done queuing things up, now waiting for results queue to drain 15794 1726882653.69716: results queue empty 15794 1726882653.69717: checking for any_errors_fatal 15794 1726882653.69719: done checking for any_errors_fatal 15794 1726882653.69720: checking for max_fail_percentage 15794 1726882653.69721: done checking for max_fail_percentage 15794 1726882653.69722: checking to see if all hosts have failed and the running result is not ok 15794 1726882653.69723: done checking to see if all hosts have failed 15794 1726882653.69724: getting the remaining hosts for this loop 15794 1726882653.69725: done getting the remaining hosts for this loop 15794 1726882653.69729: getting the next task for host managed_node1 15794 1726882653.69733: done getting next task for host managed_node1 15794 1726882653.69735: ^ task is: None 15794 1726882653.69737: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882653.69739: done queuing things up, now waiting for results queue to drain 15794 1726882653.69740: results queue empty 15794 1726882653.69741: checking for any_errors_fatal 15794 1726882653.69742: done checking for any_errors_fatal 15794 1726882653.69743: checking for max_fail_percentage 15794 1726882653.69744: done checking for max_fail_percentage 15794 1726882653.69745: checking to see if all hosts have failed and the running result is not ok 15794 1726882653.69746: done checking to see if all hosts have failed 15794 1726882653.69747: getting the next task for host managed_node1 15794 1726882653.69750: done getting next task for host managed_node1 15794 1726882653.69751: ^ task is: None 15794 1726882653.69753: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882653.69858: in VariableManager get_vars() 15794 1726882653.69880: done with get_vars() 15794 1726882653.69890: in VariableManager get_vars() 15794 1726882653.69905: done with get_vars() 15794 1726882653.69911: variable 'omit' from source: magic vars 15794 1726882653.70063: in VariableManager get_vars() 15794 1726882653.70076: done with get_vars() 15794 1726882653.70106: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 15794 1726882653.70658: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882653.70841: getting the remaining hosts for this loop 15794 1726882653.70843: done getting the remaining hosts for this loop 15794 1726882653.70846: getting the next task for host managed_node1 15794 1726882653.70850: done getting next task for host managed_node1 15794 1726882653.70852: ^ task is: TASK: Gathering Facts 15794 1726882653.70854: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882653.70857: getting variables 15794 1726882653.70861: in VariableManager get_vars() 15794 1726882653.70871: Calling all_inventory to load vars for managed_node1 15794 1726882653.70874: Calling groups_inventory to load vars for managed_node1 15794 1726882653.70877: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882653.70885: Calling all_plugins_play to load vars for managed_node1 15794 1726882653.70890: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882653.70894: Calling groups_plugins_play to load vars for managed_node1 15794 1726882653.73583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882653.76487: done with get_vars() 15794 1726882653.76516: done getting variables 15794 1726882653.76569: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 21:37:33 -0400 (0:00:00.828) 0:00:51.323 ****** 15794 1726882653.76599: entering _queue_task() for managed_node1/gather_facts 15794 1726882653.77176: worker is 1 (out of 1 available) 15794 1726882653.77191: exiting _queue_task() for managed_node1/gather_facts 15794 1726882653.77203: done queuing things up, now waiting for results queue to drain 15794 1726882653.77205: waiting for pending results... 
15794 1726882653.77593: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882653.78140: in run() - task 0affe814-3a2d-94e5-e48f-0000000004e4 15794 1726882653.78144: variable 'ansible_search_path' from source: unknown 15794 1726882653.78150: calling self._execute() 15794 1726882653.78153: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882653.78156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882653.78286: variable 'omit' from source: magic vars 15794 1726882653.79209: variable 'ansible_distribution_major_version' from source: facts 15794 1726882653.79269: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882653.79440: variable 'omit' from source: magic vars 15794 1726882653.79444: variable 'omit' from source: magic vars 15794 1726882653.79458: variable 'omit' from source: magic vars 15794 1726882653.79595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882653.79695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882653.79727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882653.79819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882653.79840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882653.79885: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882653.80123: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882653.80127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882653.80268: Set connection var ansible_connection to ssh 15794 1726882653.80288: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882653.80304: Set connection var ansible_pipelining to False 15794 1726882653.80319: Set connection var ansible_shell_executable to /bin/sh 15794 1726882653.80328: Set connection var ansible_shell_type to sh 15794 1726882653.80452: Set connection var ansible_timeout to 10 15794 1726882653.80501: variable 'ansible_shell_executable' from source: unknown 15794 1726882653.80511: variable 'ansible_connection' from source: unknown 15794 1726882653.80520: variable 'ansible_module_compression' from source: unknown 15794 1726882653.80528: variable 'ansible_shell_type' from source: unknown 15794 1726882653.80538: variable 'ansible_shell_executable' from source: unknown 15794 1726882653.80547: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882653.80557: variable 'ansible_pipelining' from source: unknown 15794 1726882653.80569: variable 'ansible_timeout' from source: unknown 15794 1726882653.80582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882653.80987: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882653.81132: variable 'omit' from source: magic vars 15794 1726882653.81148: starting attempt loop 15794 1726882653.81156: running the handler 15794 1726882653.81182: variable 'ansible_facts' from source: unknown 15794 1726882653.81250: _low_level_execute_command(): starting 15794 1726882653.81442: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882653.83093: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882653.83109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15794 1726882653.83132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882653.83250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.83263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882653.83286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882653.83462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.83543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882653.85832: stdout chunk (state=3): >>>/root <<< 15794 1726882653.85838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882653.85841: stdout chunk (state=3): >>><<< 15794 1726882653.85844: stderr chunk (state=3): >>><<< 15794 1726882653.85849: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882653.85851: _low_level_execute_command(): starting 15794 1726882653.85854: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510 `" && echo ansible-tmp-1726882653.8566804-17710-71317225639510="` echo /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510 `" ) && sleep 0' 15794 1726882653.86979: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.87051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.87173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882653.87270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.87324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882653.89318: stdout chunk (state=3): >>>ansible-tmp-1726882653.8566804-17710-71317225639510=/root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510 <<< 15794 1726882653.89724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882653.89728: stdout chunk (state=3): >>><<< 15794 1726882653.89731: stderr chunk (state=3): >>><<< 15794 1726882653.89737: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882653.8566804-17710-71317225639510=/root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882653.89753: variable 'ansible_module_compression' from source: unknown 15794 1726882653.89943: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882653.89999: variable 'ansible_facts' from source: unknown 15794 1726882653.90582: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/AnsiballZ_setup.py 15794 1726882653.90910: Sending initial data 15794 1726882653.90921: Sent initial data (153 bytes) 15794 1726882653.91429: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882653.91452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882653.91472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882653.91554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882653.91609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882653.91626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882653.91657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882653.91777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882653.93404: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882653.93458: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882653.93541: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpb_j47o2y /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/AnsiballZ_setup.py <<< 15794 1726882653.93545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/AnsiballZ_setup.py" <<< 15794 1726882653.93740: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpb_j47o2y" to remote "/root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/AnsiballZ_setup.py" <<< 15794 1726882653.98492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882653.98521: stdout chunk (state=3): >>><<< 15794 1726882653.98549: stderr chunk (state=3): >>><<< 15794 1726882653.98839: done transferring module to remote 15794 1726882653.98843: _low_level_execute_command(): starting 15794 1726882653.98846: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/ /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/AnsiballZ_setup.py && sleep 0' 15794 1726882653.99902: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882653.99916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.217 is address <<< 15794 1726882653.99928: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882654.00068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882654.00081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882654.00163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882654.02149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882654.02309: stderr chunk (state=3): >>><<< 15794 1726882654.02313: stdout chunk (state=3): >>><<< 15794 1726882654.02341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882654.02353: _low_level_execute_command(): starting 15794 1726882654.02365: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/AnsiballZ_setup.py && sleep 0' 15794 1726882654.03115: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882654.03136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882654.03154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882654.03269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882654.03293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882654.03390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882654.73487: stdout chunk 
(state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.50439453125, "5m": 0.443359375, "15m": 0.224609375}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 
22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.<<< 15794 1726882654.73541: stdout chunk (state=3): >>>local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2854, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 863, "free": 2854}, "nocache": {"free": 3459, "used": 258}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 608, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205160960, 
"block_size": 4096, "block_total": 64483404, "block_available": 61329385, "block_used": 3154019, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "34", "epoch": "1726882654", "epoch_int": "1726882654", "date": "2024-09-20", "time": "21:37:34", "iso8601_micro": "2024-09-21T01:37:34.703884Z", "iso8601": "2024-09-21T01:37:34Z", "iso8601_basic": "20240920T213734703884", "iso8601_basic_short": "20240920T213734", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": 
"10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882654.75740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882654.75838: stderr chunk (state=3): >>><<< 15794 1726882654.75875: stdout chunk (state=3): >>><<< 15794 1726882654.76140: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.50439453125, "5m": 0.443359375, "15m": 0.224609375}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon 
Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, 
"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2854, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 863, "free": 2854}, "nocache": {"free": 3459, "used": 258}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], 
"labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 608, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205160960, "block_size": 4096, "block_total": 64483404, "block_available": 61329385, "block_used": 3154019, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "34", "epoch": "1726882654", "epoch_int": "1726882654", "date": "2024-09-20", "time": "21:37:34", "iso8601_micro": "2024-09-21T01:37:34.703884Z", "iso8601": "2024-09-21T01:37:34Z", "iso8601_basic": "20240920T213734703884", "iso8601_basic_short": "20240920T213734", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
15794 1726882654.76937: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882654.77152: _low_level_execute_command(): starting 15794 1726882654.77163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882653.8566804-17710-71317225639510/ > /dev/null 2>&1 && sleep 0' 15794 1726882654.78903: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882654.79052: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882654.79068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882654.79457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882654.79759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882654.81753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882654.82039: stderr chunk (state=3): >>><<< 15794 1726882654.82047: stdout chunk (state=3): >>><<< 15794 1726882654.82050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882654.82053: handler run complete 15794 1726882654.82569: variable 'ansible_facts' from source: unknown 15794 1726882654.83070: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882654.84205: variable 'ansible_facts' from source: unknown 15794 1726882654.84429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882654.84985: attempt loop complete, returning result 15794 1726882654.85032: _execute() done 15794 1726882654.85236: dumping result to json 15794 1726882654.85240: done dumping result, returning 15794 1726882654.85242: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-0000000004e4] 15794 1726882654.85244: sending task result for task 0affe814-3a2d-94e5-e48f-0000000004e4 ok: [managed_node1] 15794 1726882654.87010: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000004e4 15794 1726882654.87014: WORKER PROCESS EXITING 15794 1726882654.87216: no more pending results, returning what we have 15794 1726882654.87220: results queue empty 15794 1726882654.87221: checking for any_errors_fatal 15794 1726882654.87223: done checking for any_errors_fatal 15794 1726882654.87224: checking for max_fail_percentage 15794 1726882654.87225: done checking for max_fail_percentage 15794 1726882654.87226: checking to see if all hosts have failed and the running result is not ok 15794 1726882654.87228: done checking to see if all hosts have failed 15794 1726882654.87228: getting the remaining hosts for this loop 15794 1726882654.87230: done getting the remaining hosts for this loop 15794 1726882654.87338: getting the next task for host managed_node1 15794 1726882654.87345: done getting next task for host managed_node1 15794 1726882654.87348: ^ task is: TASK: meta (flush_handlers) 15794 1726882654.87352: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15794 1726882654.87361: getting variables 15794 1726882654.87362: in VariableManager get_vars() 15794 1726882654.87389: Calling all_inventory to load vars for managed_node1 15794 1726882654.87393: Calling groups_inventory to load vars for managed_node1 15794 1726882654.87397: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882654.87409: Calling all_plugins_play to load vars for managed_node1 15794 1726882654.87413: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882654.87417: Calling groups_plugins_play to load vars for managed_node1 15794 1726882654.93257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.01019: done with get_vars() 15794 1726882655.01091: done getting variables 15794 1726882655.01235: in VariableManager get_vars() 15794 1726882655.01249: Calling all_inventory to load vars for managed_node1 15794 1726882655.01252: Calling groups_inventory to load vars for managed_node1 15794 1726882655.01255: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.01261: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.01265: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.01268: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.04163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.07522: done with get_vars() 15794 1726882655.07581: done queuing things up, now waiting for results queue to drain 15794 1726882655.07584: results queue empty 15794 1726882655.07585: checking for any_errors_fatal 15794 1726882655.07592: done checking for any_errors_fatal 15794 1726882655.07593: checking for max_fail_percentage 15794 1726882655.07594: done checking for max_fail_percentage 15794 
1726882655.07595: checking to see if all hosts have failed and the running result is not ok 15794 1726882655.07596: done checking to see if all hosts have failed 15794 1726882655.07603: getting the remaining hosts for this loop 15794 1726882655.07604: done getting the remaining hosts for this loop 15794 1726882655.07607: getting the next task for host managed_node1 15794 1726882655.07613: done getting next task for host managed_node1 15794 1726882655.07616: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 15794 1726882655.07618: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882655.07621: getting variables 15794 1726882655.07622: in VariableManager get_vars() 15794 1726882655.07640: Calling all_inventory to load vars for managed_node1 15794 1726882655.07643: Calling groups_inventory to load vars for managed_node1 15794 1726882655.07646: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.07654: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.07657: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.07660: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.09815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.14860: done with get_vars() 15794 1726882655.14909: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 21:37:35 -0400 (0:00:01.384) 0:00:52.708 ****** 15794 1726882655.15014: entering _queue_task() for 
managed_node1/include_tasks 15794 1726882655.15425: worker is 1 (out of 1 available) 15794 1726882655.15443: exiting _queue_task() for managed_node1/include_tasks 15794 1726882655.15459: done queuing things up, now waiting for results queue to drain 15794 1726882655.15460: waiting for pending results... 15794 1726882655.15740: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 15794 1726882655.15879: in run() - task 0affe814-3a2d-94e5-e48f-000000000074 15794 1726882655.15971: variable 'ansible_search_path' from source: unknown 15794 1726882655.15975: calling self._execute() 15794 1726882655.16076: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.16094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.16111: variable 'omit' from source: magic vars 15794 1726882655.16571: variable 'ansible_distribution_major_version' from source: facts 15794 1726882655.16591: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882655.16602: _execute() done 15794 1726882655.16609: dumping result to json 15794 1726882655.16625: done dumping result, returning 15794 1726882655.16640: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [0affe814-3a2d-94e5-e48f-000000000074] 15794 1726882655.16651: sending task result for task 0affe814-3a2d-94e5-e48f-000000000074 15794 1726882655.16864: no more pending results, returning what we have 15794 1726882655.16872: in VariableManager get_vars() 15794 1726882655.16910: Calling all_inventory to load vars for managed_node1 15794 1726882655.16914: Calling groups_inventory to load vars for managed_node1 15794 1726882655.16919: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.17039: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.17046: Calling groups_plugins_inventory to load vars for managed_node1 15794 
1726882655.17052: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.17672: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000074 15794 1726882655.17676: WORKER PROCESS EXITING 15794 1726882655.19449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.22395: done with get_vars() 15794 1726882655.22425: variable 'ansible_search_path' from source: unknown 15794 1726882655.22443: we have included files to process 15794 1726882655.22444: generating all_blocks data 15794 1726882655.22446: done generating all_blocks data 15794 1726882655.22447: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15794 1726882655.22448: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15794 1726882655.22451: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15794 1726882655.22655: in VariableManager get_vars() 15794 1726882655.22674: done with get_vars() 15794 1726882655.22821: done processing included file 15794 1726882655.22824: iterating over new_blocks loaded from include file 15794 1726882655.22826: in VariableManager get_vars() 15794 1726882655.22843: done with get_vars() 15794 1726882655.22845: filtering new block on tags 15794 1726882655.22866: done filtering new block on tags 15794 1726882655.22869: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 15794 1726882655.22875: extending task lists for all hosts with included blocks 15794 1726882655.22947: done extending task lists 15794 1726882655.22949: done processing included files 
15794 1726882655.22950: results queue empty 15794 1726882655.22951: checking for any_errors_fatal 15794 1726882655.22953: done checking for any_errors_fatal 15794 1726882655.22954: checking for max_fail_percentage 15794 1726882655.22955: done checking for max_fail_percentage 15794 1726882655.22956: checking to see if all hosts have failed and the running result is not ok 15794 1726882655.22957: done checking to see if all hosts have failed 15794 1726882655.22958: getting the remaining hosts for this loop 15794 1726882655.22960: done getting the remaining hosts for this loop 15794 1726882655.22963: getting the next task for host managed_node1 15794 1726882655.22967: done getting next task for host managed_node1 15794 1726882655.22970: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15794 1726882655.22973: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882655.22975: getting variables 15794 1726882655.22976: in VariableManager get_vars() 15794 1726882655.22986: Calling all_inventory to load vars for managed_node1 15794 1726882655.22989: Calling groups_inventory to load vars for managed_node1 15794 1726882655.22992: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.22998: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.23002: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.23006: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.25339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.28540: done with get_vars() 15794 1726882655.28575: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:37:35 -0400 (0:00:00.136) 0:00:52.844 ****** 15794 1726882655.28676: entering _queue_task() for managed_node1/include_tasks 15794 1726882655.29192: worker is 1 (out of 1 available) 15794 1726882655.29206: exiting _queue_task() for managed_node1/include_tasks 15794 1726882655.29219: done queuing things up, now waiting for results queue to drain 15794 1726882655.29221: waiting for pending results... 
15794 1726882655.29925: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 15794 1726882655.30062: in run() - task 0affe814-3a2d-94e5-e48f-0000000004f5 15794 1726882655.30091: variable 'ansible_search_path' from source: unknown 15794 1726882655.30101: variable 'ansible_search_path' from source: unknown 15794 1726882655.30167: calling self._execute() 15794 1726882655.30295: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.30313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.30343: variable 'omit' from source: magic vars 15794 1726882655.30847: variable 'ansible_distribution_major_version' from source: facts 15794 1726882655.30870: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882655.30898: _execute() done 15794 1726882655.30993: dumping result to json 15794 1726882655.30998: done dumping result, returning 15794 1726882655.31001: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-94e5-e48f-0000000004f5] 15794 1726882655.31004: sending task result for task 0affe814-3a2d-94e5-e48f-0000000004f5 15794 1726882655.31082: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000004f5 15794 1726882655.31086: WORKER PROCESS EXITING 15794 1726882655.31127: no more pending results, returning what we have 15794 1726882655.31137: in VariableManager get_vars() 15794 1726882655.31177: Calling all_inventory to load vars for managed_node1 15794 1726882655.31184: Calling groups_inventory to load vars for managed_node1 15794 1726882655.31189: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.31210: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.31215: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.31220: Calling groups_plugins_play to load vars for managed_node1 15794 
1726882655.33226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.34795: done with get_vars() 15794 1726882655.34814: variable 'ansible_search_path' from source: unknown 15794 1726882655.34815: variable 'ansible_search_path' from source: unknown 15794 1726882655.34848: we have included files to process 15794 1726882655.34849: generating all_blocks data 15794 1726882655.34850: done generating all_blocks data 15794 1726882655.34851: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15794 1726882655.34852: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15794 1726882655.34854: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15794 1726882655.36082: done processing included file 15794 1726882655.36084: iterating over new_blocks loaded from include file 15794 1726882655.36085: in VariableManager get_vars() 15794 1726882655.36102: done with get_vars() 15794 1726882655.36104: filtering new block on tags 15794 1726882655.36132: done filtering new block on tags 15794 1726882655.36138: in VariableManager get_vars() 15794 1726882655.36154: done with get_vars() 15794 1726882655.36156: filtering new block on tags 15794 1726882655.36182: done filtering new block on tags 15794 1726882655.36185: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 15794 1726882655.36190: extending task lists for all hosts with included blocks 15794 1726882655.36327: done extending task lists 15794 1726882655.36329: done processing included files 15794 1726882655.36330: results queue empty 15794 
1726882655.36331: checking for any_errors_fatal 15794 1726882655.36337: done checking for any_errors_fatal 15794 1726882655.36338: checking for max_fail_percentage 15794 1726882655.36339: done checking for max_fail_percentage 15794 1726882655.36340: checking to see if all hosts have failed and the running result is not ok 15794 1726882655.36341: done checking to see if all hosts have failed 15794 1726882655.36342: getting the remaining hosts for this loop 15794 1726882655.36344: done getting the remaining hosts for this loop 15794 1726882655.36347: getting the next task for host managed_node1 15794 1726882655.36352: done getting next task for host managed_node1 15794 1726882655.36354: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15794 1726882655.36358: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882655.36361: getting variables 15794 1726882655.36362: in VariableManager get_vars() 15794 1726882655.36442: Calling all_inventory to load vars for managed_node1 15794 1726882655.36446: Calling groups_inventory to load vars for managed_node1 15794 1726882655.36449: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.36455: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.36458: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.36462: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.38374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.41659: done with get_vars() 15794 1726882655.41696: done getting variables 15794 1726882655.41956: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:37:35 -0400 (0:00:00.133) 0:00:52.977 ****** 15794 1726882655.41993: entering _queue_task() for managed_node1/set_fact 15794 1726882655.42483: worker is 1 (out of 1 available) 15794 1726882655.42498: exiting _queue_task() for managed_node1/set_fact 15794 1726882655.42512: done queuing things up, now waiting for results queue to drain 15794 1726882655.42513: waiting for pending results... 
15794 1726882655.42875: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15794 1726882655.43042: in run() - task 0affe814-3a2d-94e5-e48f-000000000502 15794 1726882655.43052: variable 'ansible_search_path' from source: unknown 15794 1726882655.43055: variable 'ansible_search_path' from source: unknown 15794 1726882655.43085: calling self._execute() 15794 1726882655.43439: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.43443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.43446: variable 'omit' from source: magic vars 15794 1726882655.43668: variable 'ansible_distribution_major_version' from source: facts 15794 1726882655.43693: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882655.43706: variable 'omit' from source: magic vars 15794 1726882655.43768: variable 'omit' from source: magic vars 15794 1726882655.43833: variable 'omit' from source: magic vars 15794 1726882655.43889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882655.43968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882655.43998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882655.44031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882655.44053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882655.44442: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882655.44445: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.44447: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15794 1726882655.44463: Set connection var ansible_connection to ssh 15794 1726882655.44477: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882655.44490: Set connection var ansible_pipelining to False 15794 1726882655.44502: Set connection var ansible_shell_executable to /bin/sh 15794 1726882655.44557: Set connection var ansible_shell_type to sh 15794 1726882655.44574: Set connection var ansible_timeout to 10 15794 1726882655.44612: variable 'ansible_shell_executable' from source: unknown 15794 1726882655.44664: variable 'ansible_connection' from source: unknown 15794 1726882655.44674: variable 'ansible_module_compression' from source: unknown 15794 1726882655.44684: variable 'ansible_shell_type' from source: unknown 15794 1726882655.44693: variable 'ansible_shell_executable' from source: unknown 15794 1726882655.44702: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.44711: variable 'ansible_pipelining' from source: unknown 15794 1726882655.44720: variable 'ansible_timeout' from source: unknown 15794 1726882655.44730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.44913: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882655.44935: variable 'omit' from source: magic vars 15794 1726882655.44949: starting attempt loop 15794 1726882655.44957: running the handler 15794 1726882655.44977: handler run complete 15794 1726882655.44997: attempt loop complete, returning result 15794 1726882655.45005: _execute() done 15794 1726882655.45011: dumping result to json 15794 1726882655.45019: done dumping result, returning 15794 1726882655.45031: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-94e5-e48f-000000000502] 15794 1726882655.45046: sending task result for task 0affe814-3a2d-94e5-e48f-000000000502 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15794 1726882655.45307: no more pending results, returning what we have 15794 1726882655.45312: results queue empty 15794 1726882655.45313: checking for any_errors_fatal 15794 1726882655.45315: done checking for any_errors_fatal 15794 1726882655.45316: checking for max_fail_percentage 15794 1726882655.45318: done checking for max_fail_percentage 15794 1726882655.45319: checking to see if all hosts have failed and the running result is not ok 15794 1726882655.45320: done checking to see if all hosts have failed 15794 1726882655.45321: getting the remaining hosts for this loop 15794 1726882655.45323: done getting the remaining hosts for this loop 15794 1726882655.45328: getting the next task for host managed_node1 15794 1726882655.45339: done getting next task for host managed_node1 15794 1726882655.45342: ^ task is: TASK: Stat profile file 15794 1726882655.45348: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882655.45354: getting variables 15794 1726882655.45356: in VariableManager get_vars() 15794 1726882655.45389: Calling all_inventory to load vars for managed_node1 15794 1726882655.45440: Calling groups_inventory to load vars for managed_node1 15794 1726882655.45446: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.45454: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000502 15794 1726882655.45457: WORKER PROCESS EXITING 15794 1726882655.45468: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.45472: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.45476: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.48639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.51631: done with get_vars() 15794 1726882655.51673: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:37:35 -0400 (0:00:00.098) 0:00:53.076 ****** 15794 1726882655.51809: entering _queue_task() for managed_node1/stat 15794 1726882655.52231: worker is 1 (out of 1 available) 15794 1726882655.52448: exiting _queue_task() for managed_node1/stat 15794 1726882655.52460: done queuing things up, now waiting for results queue to drain 15794 1726882655.52461: waiting for pending results... 
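The `stat` task queued here (get_profile_stat.yml:9) ultimately invokes the module with the arguments visible further down in this log: `path: /etc/sysconfig/network-scripts/ifcfg-lsr27` with `get_attributes`, `get_checksum`, and `get_mime` all disabled. A hedged reconstruction of the task follows; the `register` name and the `{{ profile }}` templating are assumptions (the log only shows that `profile` comes from include params), not taken from the actual task file:

```yaml
# Reconstruction from the logged module_args; the register name and the
# templated path are assumptions, not confirmed by the source playbook.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: stat_profile_file
```

Unlike the preceding `set_fact`, this task must run on the managed host, which is why the log that follows shows the full remote execution cycle: opening the SSH connection, creating a remote temp directory, transferring the AnsiballZ-wrapped `stat` module, executing it with the remote Python, and cleaning up the temp directory.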
15794 1726882655.52544: running TaskExecutor() for managed_node1/TASK: Stat profile file 15794 1726882655.52698: in run() - task 0affe814-3a2d-94e5-e48f-000000000503 15794 1726882655.52720: variable 'ansible_search_path' from source: unknown 15794 1726882655.52729: variable 'ansible_search_path' from source: unknown 15794 1726882655.52776: calling self._execute() 15794 1726882655.52890: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.52913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.52930: variable 'omit' from source: magic vars 15794 1726882655.53943: variable 'ansible_distribution_major_version' from source: facts 15794 1726882655.53949: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882655.53953: variable 'omit' from source: magic vars 15794 1726882655.53956: variable 'omit' from source: magic vars 15794 1726882655.54140: variable 'profile' from source: include params 15794 1726882655.54144: variable 'interface' from source: set_fact 15794 1726882655.54146: variable 'interface' from source: set_fact 15794 1726882655.54149: variable 'omit' from source: magic vars 15794 1726882655.54167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882655.54218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882655.54241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882655.54266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882655.54276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882655.54317: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 
1726882655.54321: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.54326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.54485: Set connection var ansible_connection to ssh 15794 1726882655.54489: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882655.54492: Set connection var ansible_pipelining to False 15794 1726882655.54494: Set connection var ansible_shell_executable to /bin/sh 15794 1726882655.54496: Set connection var ansible_shell_type to sh 15794 1726882655.54499: Set connection var ansible_timeout to 10 15794 1726882655.54527: variable 'ansible_shell_executable' from source: unknown 15794 1726882655.54531: variable 'ansible_connection' from source: unknown 15794 1726882655.54535: variable 'ansible_module_compression' from source: unknown 15794 1726882655.54540: variable 'ansible_shell_type' from source: unknown 15794 1726882655.54542: variable 'ansible_shell_executable' from source: unknown 15794 1726882655.54548: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.54554: variable 'ansible_pipelining' from source: unknown 15794 1726882655.54557: variable 'ansible_timeout' from source: unknown 15794 1726882655.54564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.55031: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882655.55037: variable 'omit' from source: magic vars 15794 1726882655.55040: starting attempt loop 15794 1726882655.55043: running the handler 15794 1726882655.55045: _low_level_execute_command(): starting 15794 1726882655.55048: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882655.55652: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.55721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882655.55745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882655.55764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882655.55853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882655.57614: stdout chunk (state=3): >>>/root <<< 15794 1726882655.57947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882655.57951: stdout chunk (state=3): >>><<< 15794 1726882655.57953: stderr chunk (state=3): >>><<< 15794 1726882655.57956: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882655.57959: _low_level_execute_command(): starting 15794 1726882655.57962: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449 `" && echo ansible-tmp-1726882655.5783317-17776-109319893570449="` echo /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449 `" ) && sleep 0' 15794 1726882655.58448: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882655.58459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882655.58471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882655.58488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882655.58501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882655.58512: stderr chunk (state=3): >>>debug2: match not found <<< 15794 1726882655.58519: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.58536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15794 1726882655.58546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882655.58554: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15794 1726882655.58564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882655.58574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882655.58589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882655.58599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882655.58605: stderr chunk (state=3): >>>debug2: match found <<< 15794 1726882655.58616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.58695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882655.58710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882655.58726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882655.58826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882655.60819: stdout chunk (state=3): >>>ansible-tmp-1726882655.5783317-17776-109319893570449=/root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449 <<< 15794 1726882655.61029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882655.61032: stdout chunk (state=3): >>><<< 15794 1726882655.61037: stderr chunk (state=3): >>><<< 15794 1726882655.61240: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882655.5783317-17776-109319893570449=/root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882655.61244: variable 'ansible_module_compression' from source: unknown 15794 1726882655.61247: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15794 1726882655.61250: variable 'ansible_facts' from source: unknown 15794 1726882655.61338: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/AnsiballZ_stat.py 15794 1726882655.61614: Sending initial data 15794 1726882655.61618: Sent initial data (153 bytes) 15794 1726882655.62214: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882655.62231: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 15794 1726882655.62259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882655.62280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882655.62369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.62410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882655.62429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882655.62461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882655.62549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882655.64220: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882655.64263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882655.64337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp6p3qm7b4 /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/AnsiballZ_stat.py <<< 15794 1726882655.64340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/AnsiballZ_stat.py" <<< 15794 1726882655.64380: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp6p3qm7b4" to remote "/root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/AnsiballZ_stat.py" <<< 15794 1726882655.65523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882655.65740: stderr chunk (state=3): >>><<< 15794 1726882655.65745: stdout chunk (state=3): >>><<< 15794 1726882655.65747: done transferring module to remote 15794 1726882655.65749: _low_level_execute_command(): starting 15794 1726882655.65751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/ /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/AnsiballZ_stat.py && sleep 0' 15794 1726882655.66377: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882655.66398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882655.66422: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882655.66524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.66568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882655.66594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882655.66680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882655.68660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882655.68678: stdout chunk (state=3): >>><<< 15794 1726882655.68691: stderr chunk (state=3): >>><<< 15794 1726882655.68712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882655.68722: _low_level_execute_command(): starting 15794 1726882655.68732: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/AnsiballZ_stat.py && sleep 0' 15794 1726882655.69369: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882655.69394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882655.69410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882655.69427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882655.69507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 
originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.69552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882655.69569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882655.69590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882655.69694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882655.86701: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15794 1726882655.88090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882655.88150: stderr chunk (state=3): >>><<< 15794 1726882655.88154: stdout chunk (state=3): >>><<< 15794 1726882655.88168: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882655.88198: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882655.88208: _low_level_execute_command(): starting 15794 1726882655.88219: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882655.5783317-17776-109319893570449/ > /dev/null 2>&1 && sleep 0' 15794 1726882655.88691: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882655.88694: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.88701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882655.88703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882655.88751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882655.88758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882655.88820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882655.90721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882655.90768: stderr chunk (state=3): >>><<< 15794 1726882655.90771: stdout chunk (state=3): >>><<< 15794 1726882655.90787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882655.90793: handler run complete 15794 1726882655.90814: attempt loop complete, returning result 15794 1726882655.90817: _execute() done 15794 1726882655.90819: dumping result to json 15794 1726882655.90825: done dumping result, returning 15794 1726882655.90833: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affe814-3a2d-94e5-e48f-000000000503] 15794 1726882655.90840: sending task result for task 0affe814-3a2d-94e5-e48f-000000000503 15794 1726882655.90949: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000503 15794 1726882655.90951: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15794 1726882655.91018: no more pending results, returning what we have 15794 1726882655.91023: results queue empty 15794 1726882655.91024: checking for any_errors_fatal 15794 1726882655.91033: done checking for any_errors_fatal 15794 1726882655.91036: checking for max_fail_percentage 15794 1726882655.91038: done checking for max_fail_percentage 15794 1726882655.91039: checking to see if all hosts have failed and the running result is not ok 15794 1726882655.91040: done checking to see if all hosts have failed 15794 1726882655.91041: getting the remaining hosts for this loop 15794 1726882655.91043: done getting the remaining hosts for this loop 15794 1726882655.91047: getting the 
next task for host managed_node1 15794 1726882655.91055: done getting next task for host managed_node1 15794 1726882655.91058: ^ task is: TASK: Set NM profile exist flag based on the profile files 15794 1726882655.91062: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882655.91066: getting variables 15794 1726882655.91067: in VariableManager get_vars() 15794 1726882655.91099: Calling all_inventory to load vars for managed_node1 15794 1726882655.91103: Calling groups_inventory to load vars for managed_node1 15794 1726882655.91107: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.91118: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.91121: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.91124: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.92367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882655.93935: done with get_vars() 15794 1726882655.93958: done getting variables 15794 1726882655.94009: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:37:35 -0400 (0:00:00.422) 0:00:53.498 ****** 15794 1726882655.94036: entering _queue_task() for managed_node1/set_fact 15794 1726882655.94267: worker is 1 (out of 1 available) 15794 1726882655.94281: exiting _queue_task() for managed_node1/set_fact 15794 1726882655.94293: done queuing things up, now waiting for results queue to drain 15794 1726882655.94294: waiting for pending results... 
15794 1726882655.94485: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 15794 1726882655.94570: in run() - task 0affe814-3a2d-94e5-e48f-000000000504 15794 1726882655.94585: variable 'ansible_search_path' from source: unknown 15794 1726882655.94589: variable 'ansible_search_path' from source: unknown 15794 1726882655.94620: calling self._execute() 15794 1726882655.94704: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882655.94710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882655.94720: variable 'omit' from source: magic vars 15794 1726882655.95052: variable 'ansible_distribution_major_version' from source: facts 15794 1726882655.95064: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882655.95167: variable 'profile_stat' from source: set_fact 15794 1726882655.95180: Evaluated conditional (profile_stat.stat.exists): False 15794 1726882655.95193: when evaluation is False, skipping this task 15794 1726882655.95196: _execute() done 15794 1726882655.95199: dumping result to json 15794 1726882655.95201: done dumping result, returning 15794 1726882655.95206: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-94e5-e48f-000000000504] 15794 1726882655.95214: sending task result for task 0affe814-3a2d-94e5-e48f-000000000504 15794 1726882655.95307: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000504 15794 1726882655.95310: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15794 1726882655.95364: no more pending results, returning what we have 15794 1726882655.95369: results queue empty 15794 1726882655.95370: checking for any_errors_fatal 15794 1726882655.95378: done checking for any_errors_fatal 15794 1726882655.95378: 
checking for max_fail_percentage 15794 1726882655.95380: done checking for max_fail_percentage 15794 1726882655.95381: checking to see if all hosts have failed and the running result is not ok 15794 1726882655.95382: done checking to see if all hosts have failed 15794 1726882655.95383: getting the remaining hosts for this loop 15794 1726882655.95385: done getting the remaining hosts for this loop 15794 1726882655.95389: getting the next task for host managed_node1 15794 1726882655.95395: done getting next task for host managed_node1 15794 1726882655.95397: ^ task is: TASK: Get NM profile info 15794 1726882655.95402: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882655.95405: getting variables 15794 1726882655.95407: in VariableManager get_vars() 15794 1726882655.95433: Calling all_inventory to load vars for managed_node1 15794 1726882655.95438: Calling groups_inventory to load vars for managed_node1 15794 1726882655.95442: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882655.95452: Calling all_plugins_play to load vars for managed_node1 15794 1726882655.95455: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882655.95458: Calling groups_plugins_play to load vars for managed_node1 15794 1726882655.99671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882656.01213: done with get_vars() 15794 1726882656.01236: done getting variables 15794 1726882656.01305: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:37:36 -0400 (0:00:00.072) 0:00:53.571 ****** 15794 1726882656.01325: entering _queue_task() for managed_node1/shell 15794 1726882656.01326: Creating lock for shell 15794 1726882656.01593: worker is 1 (out of 1 available) 15794 1726882656.01606: exiting _queue_task() for managed_node1/shell 15794 1726882656.01620: done queuing things up, now waiting for results queue to drain 15794 1726882656.01622: waiting for pending results... 
15794 1726882656.01812: running TaskExecutor() for managed_node1/TASK: Get NM profile info 15794 1726882656.01905: in run() - task 0affe814-3a2d-94e5-e48f-000000000505 15794 1726882656.01919: variable 'ansible_search_path' from source: unknown 15794 1726882656.01922: variable 'ansible_search_path' from source: unknown 15794 1726882656.01960: calling self._execute() 15794 1726882656.02041: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882656.02048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882656.02059: variable 'omit' from source: magic vars 15794 1726882656.02402: variable 'ansible_distribution_major_version' from source: facts 15794 1726882656.02413: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882656.02420: variable 'omit' from source: magic vars 15794 1726882656.02461: variable 'omit' from source: magic vars 15794 1726882656.02550: variable 'profile' from source: include params 15794 1726882656.02555: variable 'interface' from source: set_fact 15794 1726882656.02617: variable 'interface' from source: set_fact 15794 1726882656.02638: variable 'omit' from source: magic vars 15794 1726882656.02677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882656.02709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882656.02728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882656.02749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882656.02760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882656.02792: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 
1726882656.02795: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882656.02800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882656.02888: Set connection var ansible_connection to ssh 15794 1726882656.02896: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882656.02903: Set connection var ansible_pipelining to False 15794 1726882656.02910: Set connection var ansible_shell_executable to /bin/sh 15794 1726882656.02913: Set connection var ansible_shell_type to sh 15794 1726882656.02922: Set connection var ansible_timeout to 10 15794 1726882656.02960: variable 'ansible_shell_executable' from source: unknown 15794 1726882656.02963: variable 'ansible_connection' from source: unknown 15794 1726882656.02966: variable 'ansible_module_compression' from source: unknown 15794 1726882656.02969: variable 'ansible_shell_type' from source: unknown 15794 1726882656.02972: variable 'ansible_shell_executable' from source: unknown 15794 1726882656.02974: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882656.02976: variable 'ansible_pipelining' from source: unknown 15794 1726882656.02979: variable 'ansible_timeout' from source: unknown 15794 1726882656.02982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882656.03104: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882656.03114: variable 'omit' from source: magic vars 15794 1726882656.03120: starting attempt loop 15794 1726882656.03123: running the handler 15794 1726882656.03135: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882656.03151: _low_level_execute_command(): starting 15794 1726882656.03172: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882656.03714: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882656.03719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882656.03725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.03786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882656.03791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882656.03794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882656.03858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882656.05600: stdout chunk (state=3): >>>/root <<< 15794 1726882656.05712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 
1726882656.05766: stderr chunk (state=3): >>><<< 15794 1726882656.05770: stdout chunk (state=3): >>><<< 15794 1726882656.05797: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882656.05811: _low_level_execute_command(): starting 15794 1726882656.05818: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369 `" && echo ansible-tmp-1726882656.0579717-17797-115966338256369="` echo /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369 `" ) && sleep 0' 15794 1726882656.06284: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882656.06295: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882656.06298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882656.06301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882656.06304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.06340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882656.06363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882656.06415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882656.08377: stdout chunk (state=3): >>>ansible-tmp-1726882656.0579717-17797-115966338256369=/root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369 <<< 15794 1726882656.08498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882656.08548: stderr chunk (state=3): >>><<< 15794 1726882656.08551: stdout chunk (state=3): >>><<< 15794 1726882656.08567: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882656.0579717-17797-115966338256369=/root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882656.08595: variable 'ansible_module_compression' from source: unknown 15794 1726882656.08642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882656.08682: variable 'ansible_facts' from source: unknown 15794 1726882656.08733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/AnsiballZ_command.py 15794 1726882656.08850: Sending initial data 15794 1726882656.08854: Sent initial data (156 bytes) 15794 1726882656.09289: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882656.09293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.09296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882656.09299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.09354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882656.09362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882656.09420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882656.11005: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15794 1726882656.11011: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 
1726882656.11059: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15794 1726882656.11117: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpw94ajqn1 /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/AnsiballZ_command.py <<< 15794 1726882656.11120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/AnsiballZ_command.py" <<< 15794 1726882656.11170: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpw94ajqn1" to remote "/root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/AnsiballZ_command.py" <<< 15794 1726882656.12033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882656.12089: stderr chunk (state=3): >>><<< 15794 1726882656.12093: stdout chunk (state=3): >>><<< 15794 1726882656.12114: done transferring module to remote 15794 1726882656.12126: _low_level_execute_command(): starting 15794 1726882656.12131: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/ /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/AnsiballZ_command.py && sleep 0' 15794 1726882656.12574: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882656.12577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.12583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882656.12588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.12639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882656.12644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882656.12707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882656.14514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882656.14560: stderr chunk (state=3): >>><<< 15794 1726882656.14564: stdout chunk (state=3): >>><<< 15794 1726882656.14581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882656.14585: _low_level_execute_command(): starting 15794 1726882656.14588: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/AnsiballZ_command.py && sleep 0' 15794 1726882656.15107: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882656.15111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.15115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.15144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882656.15204: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882656.34123: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:37:36.320776", "end": "2024-09-20 21:37:36.338294", "delta": "0:00:00.017518", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882656.35813: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.217 closed. <<< 15794 1726882656.35818: stdout chunk (state=3): >>><<< 15794 1726882656.35821: stderr chunk (state=3): >>><<< 15794 1726882656.35853: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:37:36.320776", "end": "2024-09-20 21:37:36.338294", "delta": "0:00:00.017518", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.10.217 closed. 15794 1726882656.36145: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882656.36150: _low_level_execute_command(): starting 15794 1726882656.36153: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882656.0579717-17797-115966338256369/ > /dev/null 2>&1 && sleep 0' 15794 1726882656.37454: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882656.37468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882656.37552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882656.37675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882656.37748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882656.37879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882656.38006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882656.40005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882656.40008: stdout chunk (state=3): >>><<< 15794 1726882656.40011: stderr chunk (state=3): >>><<< 15794 1726882656.40044: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882656.40217: handler run complete 15794 1726882656.40220: Evaluated conditional (False): False 15794 1726882656.40223: attempt loop complete, returning result 15794 1726882656.40225: _execute() done 15794 1726882656.40227: dumping result to json 15794 1726882656.40232: done dumping result, returning 15794 1726882656.40238: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affe814-3a2d-94e5-e48f-000000000505] 15794 1726882656.40244: sending task result for task 0affe814-3a2d-94e5-e48f-000000000505 15794 1726882656.40324: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000505 15794 1726882656.40328: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.017518", "end": "2024-09-20 21:37:36.338294", "rc": 1, "start": "2024-09-20 21:37:36.320776" } MSG: non-zero return code ...ignoring 15794 1726882656.40441: no more pending results, returning what we have 15794 1726882656.40445: results queue empty 15794 1726882656.40449: checking for any_errors_fatal 15794 1726882656.40458: done checking for any_errors_fatal 15794 1726882656.40459: checking for max_fail_percentage 15794 1726882656.40462: done checking for max_fail_percentage 15794 1726882656.40463: checking to see if all hosts have failed and the running result is not ok 15794 1726882656.40464: done checking to see if all hosts have failed 15794 1726882656.40465: getting the remaining hosts for this loop 15794 1726882656.40467: done getting the remaining hosts for this loop 15794 1726882656.40472: getting the next task for host managed_node1 15794 1726882656.40481: done getting next task for host managed_node1 15794 1726882656.40485: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15794 1726882656.40491: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15794 1726882656.40497: getting variables 15794 1726882656.40499: in VariableManager get_vars() 15794 1726882656.40721: Calling all_inventory to load vars for managed_node1 15794 1726882656.40725: Calling groups_inventory to load vars for managed_node1 15794 1726882656.40730: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882656.40746: Calling all_plugins_play to load vars for managed_node1 15794 1726882656.40750: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882656.40834: Calling groups_plugins_play to load vars for managed_node1 15794 1726882656.44703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882656.50794: done with get_vars() 15794 1726882656.50954: done getting variables 15794 1726882656.51027: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:37:36 -0400 (0:00:00.498) 0:00:54.069 ****** 15794 1726882656.51176: entering _queue_task() for managed_node1/set_fact 15794 1726882656.51977: worker is 1 (out of 1 available) 15794 1726882656.51991: exiting _queue_task() for managed_node1/set_fact 15794 1726882656.52005: done queuing things up, now waiting for results queue to drain 15794 1726882656.52006: waiting for pending results... 
15794 1726882656.52952: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15794 1726882656.52957: in run() - task 0affe814-3a2d-94e5-e48f-000000000506 15794 1726882656.52961: variable 'ansible_search_path' from source: unknown 15794 1726882656.52963: variable 'ansible_search_path' from source: unknown 15794 1726882656.52967: calling self._execute() 15794 1726882656.53176: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882656.53196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882656.53214: variable 'omit' from source: magic vars 15794 1726882656.54070: variable 'ansible_distribution_major_version' from source: facts 15794 1726882656.54339: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882656.54429: variable 'nm_profile_exists' from source: set_fact 15794 1726882656.54739: Evaluated conditional (nm_profile_exists.rc == 0): False 15794 1726882656.54743: when evaluation is False, skipping this task 15794 1726882656.54746: _execute() done 15794 1726882656.54749: dumping result to json 15794 1726882656.54751: done dumping result, returning 15794 1726882656.54754: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-94e5-e48f-000000000506] 15794 1726882656.54756: sending task result for task 0affe814-3a2d-94e5-e48f-000000000506 15794 1726882656.54855: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000506 15794 1726882656.54860: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 15794 1726882656.54919: no more pending results, returning what we have 15794 1726882656.54924: results queue empty 15794 1726882656.54925: checking for any_errors_fatal 15794 
1726882656.54937: done checking for any_errors_fatal 15794 1726882656.54938: checking for max_fail_percentage 15794 1726882656.54940: done checking for max_fail_percentage 15794 1726882656.54941: checking to see if all hosts have failed and the running result is not ok 15794 1726882656.54942: done checking to see if all hosts have failed 15794 1726882656.54943: getting the remaining hosts for this loop 15794 1726882656.54945: done getting the remaining hosts for this loop 15794 1726882656.54949: getting the next task for host managed_node1 15794 1726882656.54961: done getting next task for host managed_node1 15794 1726882656.54964: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15794 1726882656.54969: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882656.54973: getting variables 15794 1726882656.54975: in VariableManager get_vars() 15794 1726882656.55008: Calling all_inventory to load vars for managed_node1 15794 1726882656.55011: Calling groups_inventory to load vars for managed_node1 15794 1726882656.55016: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882656.55028: Calling all_plugins_play to load vars for managed_node1 15794 1726882656.55032: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882656.55157: Calling groups_plugins_play to load vars for managed_node1 15794 1726882656.60622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882656.66803: done with get_vars() 15794 1726882656.66892: done getting variables 15794 1726882656.66969: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882656.67378: variable 'profile' from source: include params 15794 1726882656.67383: variable 'interface' from source: set_fact 15794 1726882656.67573: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:37:36 -0400 (0:00:00.164) 0:00:54.234 ****** 15794 1726882656.67611: entering _queue_task() for managed_node1/command 15794 1726882656.68411: worker is 1 (out of 1 available) 15794 1726882656.68425: exiting _queue_task() for managed_node1/command 15794 1726882656.68543: done queuing things up, now waiting for results queue to drain 15794 1726882656.68549: waiting for pending results... 
15794 1726882656.68937: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 15794 1726882656.69440: in run() - task 0affe814-3a2d-94e5-e48f-000000000508 15794 1726882656.69445: variable 'ansible_search_path' from source: unknown 15794 1726882656.69448: variable 'ansible_search_path' from source: unknown 15794 1726882656.69450: calling self._execute() 15794 1726882656.69840: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882656.69844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882656.69847: variable 'omit' from source: magic vars 15794 1726882656.70508: variable 'ansible_distribution_major_version' from source: facts 15794 1726882656.70840: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882656.70922: variable 'profile_stat' from source: set_fact 15794 1726882656.70948: Evaluated conditional (profile_stat.stat.exists): False 15794 1726882656.70958: when evaluation is False, skipping this task 15794 1726882656.70967: _execute() done 15794 1726882656.70977: dumping result to json 15794 1726882656.70990: done dumping result, returning 15794 1726882656.71340: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 [0affe814-3a2d-94e5-e48f-000000000508] 15794 1726882656.71344: sending task result for task 0affe814-3a2d-94e5-e48f-000000000508 15794 1726882656.71426: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000508 15794 1726882656.71431: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15794 1726882656.71497: no more pending results, returning what we have 15794 1726882656.71502: results queue empty 15794 1726882656.71505: checking for any_errors_fatal 15794 1726882656.71515: done checking for any_errors_fatal 15794 1726882656.71516: checking 
for max_fail_percentage 15794 1726882656.71518: done checking for max_fail_percentage 15794 1726882656.71520: checking to see if all hosts have failed and the running result is not ok 15794 1726882656.71520: done checking to see if all hosts have failed 15794 1726882656.71521: getting the remaining hosts for this loop 15794 1726882656.71523: done getting the remaining hosts for this loop 15794 1726882656.71528: getting the next task for host managed_node1 15794 1726882656.71538: done getting next task for host managed_node1 15794 1726882656.71541: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15794 1726882656.71546: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882656.71550: getting variables 15794 1726882656.71553: in VariableManager get_vars() 15794 1726882656.71585: Calling all_inventory to load vars for managed_node1 15794 1726882656.71589: Calling groups_inventory to load vars for managed_node1 15794 1726882656.71593: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882656.71606: Calling all_plugins_play to load vars for managed_node1 15794 1726882656.71610: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882656.71614: Calling groups_plugins_play to load vars for managed_node1 15794 1726882656.77155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882656.83097: done with get_vars() 15794 1726882656.83346: done getting variables 15794 1726882656.83422: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882656.83765: variable 'profile' from source: include params 15794 1726882656.83769: variable 'interface' from source: set_fact 15794 1726882656.83850: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:37:36 -0400 (0:00:00.162) 0:00:54.396 ****** 15794 1726882656.83889: entering _queue_task() for managed_node1/set_fact 15794 1726882656.84681: worker is 1 (out of 1 available) 15794 1726882656.84695: exiting _queue_task() for managed_node1/set_fact 15794 1726882656.84707: done queuing things up, now waiting for results queue to drain 15794 1726882656.84709: waiting for pending results... 
15794 1726882656.85233: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 15794 1726882656.85547: in run() - task 0affe814-3a2d-94e5-e48f-000000000509 15794 1726882656.85564: variable 'ansible_search_path' from source: unknown 15794 1726882656.85568: variable 'ansible_search_path' from source: unknown 15794 1726882656.85726: calling self._execute() 15794 1726882656.85946: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882656.85953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882656.85969: variable 'omit' from source: magic vars 15794 1726882656.86787: variable 'ansible_distribution_major_version' from source: facts 15794 1726882656.86803: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882656.87265: variable 'profile_stat' from source: set_fact 15794 1726882656.87285: Evaluated conditional (profile_stat.stat.exists): False 15794 1726882656.87291: when evaluation is False, skipping this task 15794 1726882656.87294: _execute() done 15794 1726882656.87297: dumping result to json 15794 1726882656.87301: done dumping result, returning 15794 1726882656.87339: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [0affe814-3a2d-94e5-e48f-000000000509] 15794 1726882656.87342: sending task result for task 0affe814-3a2d-94e5-e48f-000000000509 15794 1726882656.87529: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000509 15794 1726882656.87533: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15794 1726882656.87592: no more pending results, returning what we have 15794 1726882656.87597: results queue empty 15794 1726882656.87599: checking for any_errors_fatal 15794 1726882656.87606: done checking for any_errors_fatal 15794 1726882656.87606: 
checking for max_fail_percentage 15794 1726882656.87608: done checking for max_fail_percentage 15794 1726882656.87609: checking to see if all hosts have failed and the running result is not ok 15794 1726882656.87610: done checking to see if all hosts have failed 15794 1726882656.87611: getting the remaining hosts for this loop 15794 1726882656.87613: done getting the remaining hosts for this loop 15794 1726882656.87617: getting the next task for host managed_node1 15794 1726882656.87628: done getting next task for host managed_node1 15794 1726882656.87631: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15794 1726882656.87638: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882656.87642: getting variables 15794 1726882656.87644: in VariableManager get_vars() 15794 1726882656.87676: Calling all_inventory to load vars for managed_node1 15794 1726882656.87682: Calling groups_inventory to load vars for managed_node1 15794 1726882656.87688: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882656.87704: Calling all_plugins_play to load vars for managed_node1 15794 1726882656.87707: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882656.87712: Calling groups_plugins_play to load vars for managed_node1 15794 1726882656.91320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882656.96233: done with get_vars() 15794 1726882656.96276: done getting variables 15794 1726882656.96360: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882656.96544: variable 'profile' from source: include params 15794 1726882656.96549: variable 'interface' from source: set_fact 15794 1726882656.96625: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:37:36 -0400 (0:00:00.127) 0:00:54.524 ****** 15794 1726882656.96666: entering _queue_task() for managed_node1/command 15794 1726882656.96984: worker is 1 (out of 1 available) 15794 1726882656.97001: exiting _queue_task() for managed_node1/command 15794 1726882656.97014: done queuing things up, now waiting for results queue to drain 15794 1726882656.97016: waiting for pending results... 
15794 1726882656.97342: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 15794 1726882656.97471: in run() - task 0affe814-3a2d-94e5-e48f-00000000050a 15794 1726882656.97486: variable 'ansible_search_path' from source: unknown 15794 1726882656.97492: variable 'ansible_search_path' from source: unknown 15794 1726882656.97542: calling self._execute() 15794 1726882656.97661: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882656.97761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882656.97765: variable 'omit' from source: magic vars 15794 1726882656.98151: variable 'ansible_distribution_major_version' from source: facts 15794 1726882656.98163: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882656.98299: variable 'profile_stat' from source: set_fact 15794 1726882656.98311: Evaluated conditional (profile_stat.stat.exists): False 15794 1726882656.98315: when evaluation is False, skipping this task 15794 1726882656.98318: _execute() done 15794 1726882656.98323: dumping result to json 15794 1726882656.98326: done dumping result, returning 15794 1726882656.98333: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 [0affe814-3a2d-94e5-e48f-00000000050a] 15794 1726882656.98342: sending task result for task 0affe814-3a2d-94e5-e48f-00000000050a 15794 1726882656.98439: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000050a 15794 1726882656.98443: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15794 1726882656.98526: no more pending results, returning what we have 15794 1726882656.98530: results queue empty 15794 1726882656.98532: checking for any_errors_fatal 15794 1726882656.98543: done checking for any_errors_fatal 15794 1726882656.98544: checking for 
max_fail_percentage 15794 1726882656.98546: done checking for max_fail_percentage 15794 1726882656.98547: checking to see if all hosts have failed and the running result is not ok 15794 1726882656.98548: done checking to see if all hosts have failed 15794 1726882656.98549: getting the remaining hosts for this loop 15794 1726882656.98551: done getting the remaining hosts for this loop 15794 1726882656.98560: getting the next task for host managed_node1 15794 1726882656.98571: done getting next task for host managed_node1 15794 1726882656.98575: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15794 1726882656.98579: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882656.98583: getting variables 15794 1726882656.98585: in VariableManager get_vars() 15794 1726882656.98649: Calling all_inventory to load vars for managed_node1 15794 1726882656.98653: Calling groups_inventory to load vars for managed_node1 15794 1726882656.98657: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882656.98668: Calling all_plugins_play to load vars for managed_node1 15794 1726882656.98671: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882656.98674: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.00691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.03182: done with get_vars() 15794 1726882657.03222: done getting variables 15794 1726882657.03299: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882657.03438: variable 'profile' from source: include params 15794 1726882657.03442: variable 'interface' from source: set_fact 15794 1726882657.03517: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:37:37 -0400 (0:00:00.068) 0:00:54.593 ****** 15794 1726882657.03556: entering _queue_task() for managed_node1/set_fact 15794 1726882657.04141: worker is 1 (out of 1 available) 15794 1726882657.04154: exiting _queue_task() for managed_node1/set_fact 15794 1726882657.04168: done queuing things up, now waiting for results queue to drain 15794 1726882657.04170: waiting for pending results... 
15794 1726882657.04476: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 15794 1726882657.04517: in run() - task 0affe814-3a2d-94e5-e48f-00000000050b 15794 1726882657.04539: variable 'ansible_search_path' from source: unknown 15794 1726882657.04543: variable 'ansible_search_path' from source: unknown 15794 1726882657.04648: calling self._execute() 15794 1726882657.04760: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.04765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.04768: variable 'omit' from source: magic vars 15794 1726882657.05236: variable 'ansible_distribution_major_version' from source: facts 15794 1726882657.05242: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882657.05466: variable 'profile_stat' from source: set_fact 15794 1726882657.05473: Evaluated conditional (profile_stat.stat.exists): False 15794 1726882657.05476: when evaluation is False, skipping this task 15794 1726882657.05478: _execute() done 15794 1726882657.05481: dumping result to json 15794 1726882657.05485: done dumping result, returning 15794 1726882657.05488: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 [0affe814-3a2d-94e5-e48f-00000000050b] 15794 1726882657.05490: sending task result for task 0affe814-3a2d-94e5-e48f-00000000050b 15794 1726882657.05637: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000050b 15794 1726882657.05641: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15794 1726882657.05713: no more pending results, returning what we have 15794 1726882657.05718: results queue empty 15794 1726882657.05719: checking for any_errors_fatal 15794 1726882657.05727: done checking for any_errors_fatal 15794 1726882657.05728: checking for 
max_fail_percentage 15794 1726882657.05732: done checking for max_fail_percentage 15794 1726882657.05737: checking to see if all hosts have failed and the running result is not ok 15794 1726882657.05738: done checking to see if all hosts have failed 15794 1726882657.05739: getting the remaining hosts for this loop 15794 1726882657.05743: done getting the remaining hosts for this loop 15794 1726882657.05749: getting the next task for host managed_node1 15794 1726882657.05760: done getting next task for host managed_node1 15794 1726882657.05768: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 15794 1726882657.05772: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882657.05784: getting variables 15794 1726882657.05789: in VariableManager get_vars() 15794 1726882657.05828: Calling all_inventory to load vars for managed_node1 15794 1726882657.05832: Calling groups_inventory to load vars for managed_node1 15794 1726882657.06044: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882657.06060: Calling all_plugins_play to load vars for managed_node1 15794 1726882657.06068: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882657.06073: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.08594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.11459: done with get_vars() 15794 1726882657.11497: done getting variables 15794 1726882657.11572: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882657.11708: variable 'profile' from source: include params 15794 1726882657.11713: variable 'interface' from source: set_fact 15794 1726882657.11789: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:37:37 -0400 (0:00:00.082) 0:00:54.676 ****** 15794 1726882657.11824: entering _queue_task() for managed_node1/assert 15794 1726882657.12192: worker is 1 (out of 1 available) 15794 1726882657.12206: exiting _queue_task() for managed_node1/assert 15794 1726882657.12219: done queuing things up, now waiting for results queue to drain 15794 1726882657.12220: waiting for pending results... 
15794 1726882657.12776: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' 15794 1726882657.12789: in run() - task 0affe814-3a2d-94e5-e48f-0000000004f6 15794 1726882657.12793: variable 'ansible_search_path' from source: unknown 15794 1726882657.12797: variable 'ansible_search_path' from source: unknown 15794 1726882657.12800: calling self._execute() 15794 1726882657.13061: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.13066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.13070: variable 'omit' from source: magic vars 15794 1726882657.13511: variable 'ansible_distribution_major_version' from source: facts 15794 1726882657.13517: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882657.13521: variable 'omit' from source: magic vars 15794 1726882657.13525: variable 'omit' from source: magic vars 15794 1726882657.13690: variable 'profile' from source: include params 15794 1726882657.13695: variable 'interface' from source: set_fact 15794 1726882657.13824: variable 'interface' from source: set_fact 15794 1726882657.13828: variable 'omit' from source: magic vars 15794 1726882657.13855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882657.13908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882657.13941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882657.14017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882657.14021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882657.14024: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 15794 1726882657.14027: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.14029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.14174: Set connection var ansible_connection to ssh 15794 1726882657.14242: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882657.14245: Set connection var ansible_pipelining to False 15794 1726882657.14247: Set connection var ansible_shell_executable to /bin/sh 15794 1726882657.14249: Set connection var ansible_shell_type to sh 15794 1726882657.14252: Set connection var ansible_timeout to 10 15794 1726882657.14269: variable 'ansible_shell_executable' from source: unknown 15794 1726882657.14277: variable 'ansible_connection' from source: unknown 15794 1726882657.14285: variable 'ansible_module_compression' from source: unknown 15794 1726882657.14292: variable 'ansible_shell_type' from source: unknown 15794 1726882657.14299: variable 'ansible_shell_executable' from source: unknown 15794 1726882657.14312: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.14325: variable 'ansible_pipelining' from source: unknown 15794 1726882657.14335: variable 'ansible_timeout' from source: unknown 15794 1726882657.14366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.14552: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882657.14583: variable 'omit' from source: magic vars 15794 1726882657.14640: starting attempt loop 15794 1726882657.14648: running the handler 15794 1726882657.14786: variable 'lsr_net_profile_exists' from source: set_fact 15794 1726882657.14805: Evaluated conditional (not 
lsr_net_profile_exists): True 15794 1726882657.14821: handler run complete 15794 1726882657.14903: attempt loop complete, returning result 15794 1726882657.14908: _execute() done 15794 1726882657.14910: dumping result to json 15794 1726882657.14913: done dumping result, returning 15794 1726882657.14915: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' [0affe814-3a2d-94e5-e48f-0000000004f6] 15794 1726882657.14919: sending task result for task 0affe814-3a2d-94e5-e48f-0000000004f6 15794 1726882657.15110: done sending task result for task 0affe814-3a2d-94e5-e48f-0000000004f6 15794 1726882657.15113: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15794 1726882657.15175: no more pending results, returning what we have 15794 1726882657.15179: results queue empty 15794 1726882657.15180: checking for any_errors_fatal 15794 1726882657.15195: done checking for any_errors_fatal 15794 1726882657.15196: checking for max_fail_percentage 15794 1726882657.15201: done checking for max_fail_percentage 15794 1726882657.15203: checking to see if all hosts have failed and the running result is not ok 15794 1726882657.15204: done checking to see if all hosts have failed 15794 1726882657.15205: getting the remaining hosts for this loop 15794 1726882657.15207: done getting the remaining hosts for this loop 15794 1726882657.15216: getting the next task for host managed_node1 15794 1726882657.15227: done getting next task for host managed_node1 15794 1726882657.15237: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15794 1726882657.15244: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882657.15250: getting variables 15794 1726882657.15252: in VariableManager get_vars() 15794 1726882657.15297: Calling all_inventory to load vars for managed_node1 15794 1726882657.15303: Calling groups_inventory to load vars for managed_node1 15794 1726882657.15308: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882657.15322: Calling all_plugins_play to load vars for managed_node1 15794 1726882657.15327: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882657.15332: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.18075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.21353: done with get_vars() 15794 1726882657.21391: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 21:37:37 -0400 (0:00:00.097) 0:00:54.773 ****** 15794 1726882657.21548: entering _queue_task() for managed_node1/include_tasks 15794 1726882657.22019: worker is 1 (out of 1 available) 15794 1726882657.22033: exiting _queue_task() for managed_node1/include_tasks 15794 1726882657.22251: done queuing things up, now waiting for results queue to drain 15794 1726882657.22253: waiting for pending results... 
15794 1726882657.22574: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 15794 1726882657.22581: in run() - task 0affe814-3a2d-94e5-e48f-000000000075 15794 1726882657.22605: variable 'ansible_search_path' from source: unknown 15794 1726882657.22653: calling self._execute() 15794 1726882657.22788: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.22807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.22826: variable 'omit' from source: magic vars 15794 1726882657.23400: variable 'ansible_distribution_major_version' from source: facts 15794 1726882657.23441: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882657.23454: _execute() done 15794 1726882657.23543: dumping result to json 15794 1726882657.23555: done dumping result, returning 15794 1726882657.23559: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0affe814-3a2d-94e5-e48f-000000000075] 15794 1726882657.23562: sending task result for task 0affe814-3a2d-94e5-e48f-000000000075 15794 1726882657.23643: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000075 15794 1726882657.23646: WORKER PROCESS EXITING 15794 1726882657.23683: no more pending results, returning what we have 15794 1726882657.23690: in VariableManager get_vars() 15794 1726882657.23727: Calling all_inventory to load vars for managed_node1 15794 1726882657.23731: Calling groups_inventory to load vars for managed_node1 15794 1726882657.23737: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882657.23754: Calling all_plugins_play to load vars for managed_node1 15794 1726882657.23758: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882657.23763: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.26755: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.30342: done with get_vars() 15794 1726882657.30379: variable 'ansible_search_path' from source: unknown 15794 1726882657.30397: we have included files to process 15794 1726882657.30398: generating all_blocks data 15794 1726882657.30400: done generating all_blocks data 15794 1726882657.30406: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15794 1726882657.30408: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15794 1726882657.30411: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15794 1726882657.30680: in VariableManager get_vars() 15794 1726882657.30703: done with get_vars() 15794 1726882657.30944: done processing included file 15794 1726882657.30947: iterating over new_blocks loaded from include file 15794 1726882657.30949: in VariableManager get_vars() 15794 1726882657.30963: done with get_vars() 15794 1726882657.30975: filtering new block on tags 15794 1726882657.30998: done filtering new block on tags 15794 1726882657.31001: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 15794 1726882657.31007: extending task lists for all hosts with included blocks 15794 1726882657.31284: done extending task lists 15794 1726882657.31286: done processing included files 15794 1726882657.31287: results queue empty 15794 1726882657.31288: checking for any_errors_fatal 15794 1726882657.31298: done checking for any_errors_fatal 15794 1726882657.31299: checking for max_fail_percentage 15794 1726882657.31300: done 
checking for max_fail_percentage 15794 1726882657.31302: checking to see if all hosts have failed and the running result is not ok 15794 1726882657.31303: done checking to see if all hosts have failed 15794 1726882657.31304: getting the remaining hosts for this loop 15794 1726882657.31305: done getting the remaining hosts for this loop 15794 1726882657.31309: getting the next task for host managed_node1 15794 1726882657.31313: done getting next task for host managed_node1 15794 1726882657.31316: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15794 1726882657.31319: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882657.31322: getting variables 15794 1726882657.31323: in VariableManager get_vars() 15794 1726882657.31333: Calling all_inventory to load vars for managed_node1 15794 1726882657.31338: Calling groups_inventory to load vars for managed_node1 15794 1726882657.31341: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882657.31360: Calling all_plugins_play to load vars for managed_node1 15794 1726882657.31371: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882657.31376: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.33768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.37720: done with get_vars() 15794 1726882657.37762: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:37:37 -0400 (0:00:00.163) 0:00:54.936 ****** 15794 1726882657.37907: entering _queue_task() for managed_node1/include_tasks 15794 1726882657.38760: worker is 1 (out of 1 available) 15794 1726882657.38770: exiting _queue_task() for managed_node1/include_tasks 15794 1726882657.38784: done queuing things up, now waiting for results queue to drain 15794 1726882657.38785: waiting for pending results... 
15794 1726882657.39067: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15794 1726882657.39227: in run() - task 0affe814-3a2d-94e5-e48f-00000000053c 15794 1726882657.39245: variable 'ansible_search_path' from source: unknown 15794 1726882657.39249: variable 'ansible_search_path' from source: unknown 15794 1726882657.39375: calling self._execute() 15794 1726882657.39606: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.39613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.39628: variable 'omit' from source: magic vars 15794 1726882657.40440: variable 'ansible_distribution_major_version' from source: facts 15794 1726882657.40816: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882657.40823: _execute() done 15794 1726882657.40827: dumping result to json 15794 1726882657.40832: done dumping result, returning 15794 1726882657.40842: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-94e5-e48f-00000000053c] 15794 1726882657.40852: sending task result for task 0affe814-3a2d-94e5-e48f-00000000053c 15794 1726882657.40999: no more pending results, returning what we have 15794 1726882657.41007: in VariableManager get_vars() 15794 1726882657.41057: Calling all_inventory to load vars for managed_node1 15794 1726882657.41061: Calling groups_inventory to load vars for managed_node1 15794 1726882657.41065: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882657.41078: Calling all_plugins_play to load vars for managed_node1 15794 1726882657.41084: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882657.41087: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.41801: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000053c 15794 1726882657.41806: WORKER PROCESS EXITING 15794 
1726882657.44874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.47247: done with get_vars() 15794 1726882657.47268: variable 'ansible_search_path' from source: unknown 15794 1726882657.47269: variable 'ansible_search_path' from source: unknown 15794 1726882657.47304: we have included files to process 15794 1726882657.47305: generating all_blocks data 15794 1726882657.47306: done generating all_blocks data 15794 1726882657.47307: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15794 1726882657.47308: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15794 1726882657.47310: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15794 1726882657.47466: done processing included file 15794 1726882657.47468: iterating over new_blocks loaded from include file 15794 1726882657.47469: in VariableManager get_vars() 15794 1726882657.47482: done with get_vars() 15794 1726882657.47483: filtering new block on tags 15794 1726882657.47496: done filtering new block on tags 15794 1726882657.47498: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15794 1726882657.47502: extending task lists for all hosts with included blocks 15794 1726882657.47587: done extending task lists 15794 1726882657.47588: done processing included files 15794 1726882657.47589: results queue empty 15794 1726882657.47589: checking for any_errors_fatal 15794 1726882657.47592: done checking for any_errors_fatal 15794 1726882657.47592: checking for max_fail_percentage 15794 1726882657.47593: done checking for 
max_fail_percentage 15794 1726882657.47594: checking to see if all hosts have failed and the running result is not ok 15794 1726882657.47594: done checking to see if all hosts have failed 15794 1726882657.47595: getting the remaining hosts for this loop 15794 1726882657.47596: done getting the remaining hosts for this loop 15794 1726882657.47598: getting the next task for host managed_node1 15794 1726882657.47601: done getting next task for host managed_node1 15794 1726882657.47603: ^ task is: TASK: Get stat for interface {{ interface }} 15794 1726882657.47606: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882657.47607: getting variables 15794 1726882657.47608: in VariableManager get_vars() 15794 1726882657.47615: Calling all_inventory to load vars for managed_node1 15794 1726882657.47617: Calling groups_inventory to load vars for managed_node1 15794 1726882657.47619: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882657.47625: Calling all_plugins_play to load vars for managed_node1 15794 1726882657.47627: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882657.47629: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.49088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.51554: done with get_vars() 15794 1726882657.51576: done getting variables 15794 1726882657.51711: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:37:37 -0400 (0:00:00.138) 0:00:55.075 ****** 15794 1726882657.51740: entering _queue_task() for managed_node1/stat 15794 1726882657.52010: worker is 1 (out of 1 available) 15794 1726882657.52023: exiting _queue_task() for managed_node1/stat 15794 1726882657.52036: done queuing things up, now waiting for results queue to drain 15794 1726882657.52038: waiting for pending results... 
15794 1726882657.52230: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 15794 1726882657.52330: in run() - task 0affe814-3a2d-94e5-e48f-000000000554 15794 1726882657.52344: variable 'ansible_search_path' from source: unknown 15794 1726882657.52347: variable 'ansible_search_path' from source: unknown 15794 1726882657.52384: calling self._execute() 15794 1726882657.52462: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.52467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.52482: variable 'omit' from source: magic vars 15794 1726882657.52797: variable 'ansible_distribution_major_version' from source: facts 15794 1726882657.52807: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882657.52819: variable 'omit' from source: magic vars 15794 1726882657.52859: variable 'omit' from source: magic vars 15794 1726882657.52943: variable 'interface' from source: set_fact 15794 1726882657.52957: variable 'omit' from source: magic vars 15794 1726882657.52993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882657.53023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882657.53046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882657.53063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882657.53073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882657.53102: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882657.53105: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.53110: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.53196: Set connection var ansible_connection to ssh 15794 1726882657.53230: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882657.53236: Set connection var ansible_pipelining to False 15794 1726882657.53240: Set connection var ansible_shell_executable to /bin/sh 15794 1726882657.53242: Set connection var ansible_shell_type to sh 15794 1726882657.53327: Set connection var ansible_timeout to 10 15794 1726882657.53330: variable 'ansible_shell_executable' from source: unknown 15794 1726882657.53339: variable 'ansible_connection' from source: unknown 15794 1726882657.53346: variable 'ansible_module_compression' from source: unknown 15794 1726882657.53348: variable 'ansible_shell_type' from source: unknown 15794 1726882657.53351: variable 'ansible_shell_executable' from source: unknown 15794 1726882657.53353: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.53355: variable 'ansible_pipelining' from source: unknown 15794 1726882657.53358: variable 'ansible_timeout' from source: unknown 15794 1726882657.53360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.53740: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15794 1726882657.53744: variable 'omit' from source: magic vars 15794 1726882657.53747: starting attempt loop 15794 1726882657.53749: running the handler 15794 1726882657.53752: _low_level_execute_command(): starting 15794 1726882657.53754: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882657.54395: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882657.54417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.54467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882657.54489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882657.54549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882657.56311: stdout chunk (state=3): >>>/root <<< 15794 1726882657.56425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882657.56483: stderr chunk (state=3): >>><<< 15794 1726882657.56485: stdout chunk (state=3): >>><<< 15794 1726882657.56541: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882657.56544: _low_level_execute_command(): starting 15794 1726882657.56548: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100 `" && echo ansible-tmp-1726882657.565031-17845-123757223333100="` echo /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100 `" ) && sleep 0' 15794 1726882657.56932: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882657.56968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 15794 1726882657.56972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 15794 1726882657.56974: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882657.56983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.57022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882657.57037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882657.57106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882657.59074: stdout chunk (state=3): >>>ansible-tmp-1726882657.565031-17845-123757223333100=/root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100 <<< 15794 1726882657.59205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882657.59246: stderr chunk (state=3): >>><<< 15794 1726882657.59250: stdout chunk (state=3): >>><<< 15794 1726882657.59265: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882657.565031-17845-123757223333100=/root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
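The `_low_level_execute_command()` call above stages a unique per-task temp directory on the remote host before any module code is copied over. The real command (with its timestamped name) is quoted verbatim in the log; the fragment below is only an illustrative re-creation of that pattern, with `tmp_root` and `stamp` as stand-in names, showing why the `umask 77` subshell matters: the directory comes up `0700`, readable only by the connecting user.

```shell
# Sketch of the remote tmpdir step Ansible runs over SSH.
# tmp_root and stamp are illustrative stand-ins, not the values from this run.
tmp_root="${TMPDIR:-/tmp}/ansible-demo"
stamp="ansible-tmp-$(date +%s)-$$"
( umask 77 && mkdir -p "$tmp_root" && mkdir "$tmp_root/$stamp" \
  && echo "$stamp=$tmp_root/$stamp" )
```

The `echo name=path` at the end is how the controller learns the directory it just created: it parses that `key=value` line out of stdout, exactly as seen in the `rc=0, stdout=ansible-tmp-...=...` result below.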
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882657.59309: variable 'ansible_module_compression' from source: unknown 15794 1726882657.59354: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15794 1726882657.59390: variable 'ansible_facts' from source: unknown 15794 1726882657.59444: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/AnsiballZ_stat.py 15794 1726882657.59553: Sending initial data 15794 1726882657.59557: Sent initial data (152 bytes) 15794 1726882657.60011: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882657.60014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882657.60017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.60019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882657.60022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.60082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882657.60085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882657.60141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882657.61722: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882657.61778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882657.61835: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp7eyka095 /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/AnsiballZ_stat.py <<< 15794 1726882657.61839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/AnsiballZ_stat.py" <<< 15794 1726882657.61889: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmp7eyka095" to remote "/root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/AnsiballZ_stat.py" <<< 15794 1726882657.62748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882657.62807: stderr chunk (state=3): >>><<< 15794 1726882657.62810: stdout chunk (state=3): >>><<< 15794 1726882657.62828: done transferring module to remote 15794 1726882657.62843: _low_level_execute_command(): starting 15794 1726882657.62849: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/ /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/AnsiballZ_stat.py && sleep 0' 15794 1726882657.63297: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882657.63300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.63303: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882657.63305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.63355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882657.63358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882657.63423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882657.65250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882657.65296: stderr chunk (state=3): >>><<< 15794 1726882657.65299: stdout chunk (state=3): >>><<< 15794 1726882657.65314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882657.65317: _low_level_execute_command(): starting 15794 1726882657.65323: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/AnsiballZ_stat.py && sleep 0' 15794 1726882657.65736: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882657.65776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882657.65779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.65782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 15794 1726882657.65784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 15794 1726882657.65787: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.65837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882657.65840: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882657.65911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882657.82822: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15794 1726882657.84228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882657.84232: stdout chunk (state=3): >>><<< 15794 1726882657.84236: stderr chunk (state=3): >>><<< 15794 1726882657.84255: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882657.84331: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882657.84337: _low_level_execute_command(): starting 15794 1726882657.84340: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882657.565031-17845-123757223333100/ > /dev/null 2>&1 && sleep 0' 15794 1726882657.85048: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882657.85064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882657.85082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882657.85111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882657.85150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
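The `stat` module invocation above returns `{"changed": false, "stat": {"exists": false}}` for `path: /sys/class/net/lsr27`. The real module does far more (checksums, MIME, attributes), but the existence check this test relies on reduces to a sysfs lookup; the snippet below is a minimal sketch of just that reduction, not the module's implementation.

```python
import os

def interface_exists(name: str) -> bool:
    # On Linux, a network interface exists iff its sysfs node does.
    return os.path.exists(os.path.join("/sys/class/net", name))

# Shape of the result the log shows for the (now removed) test interface:
result = {"changed": False, "stat": {"exists": interface_exists("lsr27")}}
```

With the `lsr27` test device torn down by cleanup, `exists` comes back false, which is precisely what the following assert task checks.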
debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882657.85164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882657.85259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882657.85276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882657.85329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882657.85390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882657.87442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882657.87450: stdout chunk (state=3): >>><<< 15794 1726882657.87452: stderr chunk (state=3): >>><<< 15794 1726882657.87455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 
originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882657.87457: handler run complete 15794 1726882657.87491: attempt loop complete, returning result 15794 1726882657.87540: _execute() done 15794 1726882657.87543: dumping result to json 15794 1726882657.87546: done dumping result, returning 15794 1726882657.87548: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [0affe814-3a2d-94e5-e48f-000000000554] 15794 1726882657.87550: sending task result for task 0affe814-3a2d-94e5-e48f-000000000554 15794 1726882657.87866: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000554 15794 1726882657.87869: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15794 1726882657.87957: no more pending results, returning what we have 15794 1726882657.87962: results queue empty 15794 1726882657.87963: checking for any_errors_fatal 15794 1726882657.87966: done checking for any_errors_fatal 15794 1726882657.87966: checking for max_fail_percentage 15794 1726882657.87969: done checking for max_fail_percentage 15794 1726882657.87970: checking to see if all hosts have failed and the running result is not ok 15794 1726882657.87971: done checking to see if all hosts have failed 15794 1726882657.87971: getting the remaining hosts for this loop 15794 1726882657.87973: done getting the remaining hosts for this loop 15794 1726882657.87985: getting the next task for host managed_node1 15794 1726882657.87996: done getting next task for host managed_node1 15794 1726882657.87999: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15794 1726882657.88003: ^ state is: HOST STATE: block=2, 
task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882657.88007: getting variables 15794 1726882657.88009: in VariableManager get_vars() 15794 1726882657.88105: Calling all_inventory to load vars for managed_node1 15794 1726882657.88109: Calling groups_inventory to load vars for managed_node1 15794 1726882657.88114: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882657.88126: Calling all_plugins_play to load vars for managed_node1 15794 1726882657.88130: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882657.88136: Calling groups_plugins_play to load vars for managed_node1 15794 1726882657.90625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882657.94786: done with get_vars() 15794 1726882657.94824: done getting variables 15794 1726882657.94902: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15794 1726882657.95045: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 
21:37:37 -0400 (0:00:00.433) 0:00:55.508 ****** 15794 1726882657.95083: entering _queue_task() for managed_node1/assert 15794 1726882657.96055: worker is 1 (out of 1 available) 15794 1726882657.96141: exiting _queue_task() for managed_node1/assert 15794 1726882657.96155: done queuing things up, now waiting for results queue to drain 15794 1726882657.96157: waiting for pending results... 15794 1726882657.96628: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' 15794 1726882657.96729: in run() - task 0affe814-3a2d-94e5-e48f-00000000053d 15794 1726882657.96948: variable 'ansible_search_path' from source: unknown 15794 1726882657.96952: variable 'ansible_search_path' from source: unknown 15794 1726882657.96993: calling self._execute() 15794 1726882657.97239: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.97243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.97247: variable 'omit' from source: magic vars 15794 1726882657.97933: variable 'ansible_distribution_major_version' from source: facts 15794 1726882657.98150: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882657.98157: variable 'omit' from source: magic vars 15794 1726882657.98208: variable 'omit' from source: magic vars 15794 1726882657.98324: variable 'interface' from source: set_fact 15794 1726882657.98549: variable 'omit' from source: magic vars 15794 1726882657.98594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882657.98636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882657.98660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882657.98683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15794 1726882657.98695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882657.98732: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882657.98737: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.98996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.99062: Set connection var ansible_connection to ssh 15794 1726882657.99071: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882657.99081: Set connection var ansible_pipelining to False 15794 1726882657.99087: Set connection var ansible_shell_executable to /bin/sh 15794 1726882657.99091: Set connection var ansible_shell_type to sh 15794 1726882657.99103: Set connection var ansible_timeout to 10 15794 1726882657.99135: variable 'ansible_shell_executable' from source: unknown 15794 1726882657.99342: variable 'ansible_connection' from source: unknown 15794 1726882657.99346: variable 'ansible_module_compression' from source: unknown 15794 1726882657.99351: variable 'ansible_shell_type' from source: unknown 15794 1726882657.99354: variable 'ansible_shell_executable' from source: unknown 15794 1726882657.99359: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882657.99365: variable 'ansible_pipelining' from source: unknown 15794 1726882657.99368: variable 'ansible_timeout' from source: unknown 15794 1726882657.99374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882657.99536: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882657.99751: 
variable 'omit' from source: magic vars 15794 1726882657.99757: starting attempt loop 15794 1726882657.99760: running the handler 15794 1726882657.99937: variable 'interface_stat' from source: set_fact 15794 1726882658.00153: Evaluated conditional (not interface_stat.stat.exists): True 15794 1726882658.00159: handler run complete 15794 1726882658.00182: attempt loop complete, returning result 15794 1726882658.00185: _execute() done 15794 1726882658.00188: dumping result to json 15794 1726882658.00191: done dumping result, returning 15794 1726882658.00196: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' [0affe814-3a2d-94e5-e48f-00000000053d] 15794 1726882658.00204: sending task result for task 0affe814-3a2d-94e5-e48f-00000000053d 15794 1726882658.00539: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000053d 15794 1726882658.00542: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15794 1726882658.00597: no more pending results, returning what we have 15794 1726882658.00601: results queue empty 15794 1726882658.00602: checking for any_errors_fatal 15794 1726882658.00611: done checking for any_errors_fatal 15794 1726882658.00612: checking for max_fail_percentage 15794 1726882658.00614: done checking for max_fail_percentage 15794 1726882658.00615: checking to see if all hosts have failed and the running result is not ok 15794 1726882658.00616: done checking to see if all hosts have failed 15794 1726882658.00617: getting the remaining hosts for this loop 15794 1726882658.00619: done getting the remaining hosts for this loop 15794 1726882658.00623: getting the next task for host managed_node1 15794 1726882658.00630: done getting next task for host managed_node1 15794 1726882658.00632: ^ task is: TASK: meta (flush_handlers) 15794 1726882658.00636: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
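The `assert` action above evaluates `not interface_stat.stat.exists` against the registered stat result and reports "All assertions passed". The playbook source is not part of this log, so the shell restatement below is an assumption about what the condition amounts to on the target: the task passes exactly when the device's sysfs node is gone.

```shell
# Shell restatement of the assert condition (sketch): pass when the
# interface's sysfs node is absent after cleanup.
iface=lsr27
if [ ! -e "/sys/class/net/$iface" ]; then
  echo "All assertions passed"
else
  echo "interface $iface still present" >&2
  exit 1
fi
```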
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882658.00640: getting variables 15794 1726882658.00642: in VariableManager get_vars() 15794 1726882658.00669: Calling all_inventory to load vars for managed_node1 15794 1726882658.00672: Calling groups_inventory to load vars for managed_node1 15794 1726882658.00675: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882658.00688: Calling all_plugins_play to load vars for managed_node1 15794 1726882658.00691: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882658.00695: Calling groups_plugins_play to load vars for managed_node1 15794 1726882658.05251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882658.08328: done with get_vars() 15794 1726882658.08369: done getting variables 15794 1726882658.08461: in VariableManager get_vars() 15794 1726882658.08474: Calling all_inventory to load vars for managed_node1 15794 1726882658.08477: Calling groups_inventory to load vars for managed_node1 15794 1726882658.08489: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882658.08495: Calling all_plugins_play to load vars for managed_node1 15794 1726882658.08499: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882658.08502: Calling groups_plugins_play to load vars for managed_node1 15794 1726882658.10660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882658.16212: done with get_vars() 15794 1726882658.16468: done queuing things up, now waiting for results queue to drain 15794 1726882658.16471: results queue empty 15794 1726882658.16472: checking for any_errors_fatal 15794 1726882658.16476: done checking for any_errors_fatal 15794 1726882658.16477: 
checking for max_fail_percentage 15794 1726882658.16481: done checking for max_fail_percentage 15794 1726882658.16482: checking to see if all hosts have failed and the running result is not ok 15794 1726882658.16483: done checking to see if all hosts have failed 15794 1726882658.16490: getting the remaining hosts for this loop 15794 1726882658.16492: done getting the remaining hosts for this loop 15794 1726882658.16495: getting the next task for host managed_node1 15794 1726882658.16500: done getting next task for host managed_node1 15794 1726882658.16502: ^ task is: TASK: meta (flush_handlers) 15794 1726882658.16504: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882658.16508: getting variables 15794 1726882658.16509: in VariableManager get_vars() 15794 1726882658.16520: Calling all_inventory to load vars for managed_node1 15794 1726882658.16523: Calling groups_inventory to load vars for managed_node1 15794 1726882658.16526: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882658.16532: Calling all_plugins_play to load vars for managed_node1 15794 1726882658.16537: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882658.16541: Calling groups_plugins_play to load vars for managed_node1 15794 1726882658.20851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882658.25358: done with get_vars() 15794 1726882658.25404: done getting variables 15794 1726882658.25472: in VariableManager get_vars() 15794 1726882658.25491: Calling all_inventory to load vars for managed_node1 15794 1726882658.25494: Calling groups_inventory to load vars for managed_node1 15794 1726882658.25498: Calling all_plugins_inventory 
to load vars for managed_node1 15794 1726882658.25504: Calling all_plugins_play to load vars for managed_node1 15794 1726882658.25507: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882658.25511: Calling groups_plugins_play to load vars for managed_node1 15794 1726882658.27787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882658.40476: done with get_vars() 15794 1726882658.40522: done queuing things up, now waiting for results queue to drain 15794 1726882658.40525: results queue empty 15794 1726882658.40526: checking for any_errors_fatal 15794 1726882658.40527: done checking for any_errors_fatal 15794 1726882658.40528: checking for max_fail_percentage 15794 1726882658.40530: done checking for max_fail_percentage 15794 1726882658.40531: checking to see if all hosts have failed and the running result is not ok 15794 1726882658.40532: done checking to see if all hosts have failed 15794 1726882658.40533: getting the remaining hosts for this loop 15794 1726882658.40536: done getting the remaining hosts for this loop 15794 1726882658.40539: getting the next task for host managed_node1 15794 1726882658.40543: done getting next task for host managed_node1 15794 1726882658.40545: ^ task is: None 15794 1726882658.40546: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882658.40548: done queuing things up, now waiting for results queue to drain 15794 1726882658.40549: results queue empty 15794 1726882658.40550: checking for any_errors_fatal 15794 1726882658.40551: done checking for any_errors_fatal 15794 1726882658.40552: checking for max_fail_percentage 15794 1726882658.40553: done checking for max_fail_percentage 15794 1726882658.40554: checking to see if all hosts have failed and the running result is not ok 15794 1726882658.40555: done checking to see if all hosts have failed 15794 1726882658.40556: getting the next task for host managed_node1 15794 1726882658.40559: done getting next task for host managed_node1 15794 1726882658.40560: ^ task is: None 15794 1726882658.40562: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882658.40611: in VariableManager get_vars() 15794 1726882658.40629: done with get_vars() 15794 1726882658.40638: in VariableManager get_vars() 15794 1726882658.40649: done with get_vars() 15794 1726882658.40654: variable 'omit' from source: magic vars 15794 1726882658.40691: in VariableManager get_vars() 15794 1726882658.40703: done with get_vars() 15794 1726882658.40727: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15794 1726882658.41010: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15794 1726882658.41036: getting the remaining hosts for this loop 15794 1726882658.41037: done getting the remaining hosts for this loop 15794 1726882658.41041: getting the next task for host managed_node1 15794 1726882658.41044: done getting next task for host managed_node1 15794 1726882658.41046: ^ task is: TASK: Gathering Facts 15794 1726882658.41048: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882658.41050: getting variables 15794 1726882658.41052: in VariableManager get_vars() 15794 1726882658.41062: Calling all_inventory to load vars for managed_node1 15794 1726882658.41065: Calling groups_inventory to load vars for managed_node1 15794 1726882658.41068: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882658.41074: Calling all_plugins_play to load vars for managed_node1 15794 1726882658.41077: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882658.41084: Calling groups_plugins_play to load vars for managed_node1 15794 1726882658.43483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882658.49804: done with get_vars() 15794 1726882658.49845: done getting variables 15794 1726882658.49908: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 21:37:38 -0400 (0:00:00.548) 0:00:56.057 ****** 15794 1726882658.49939: entering _queue_task() for managed_node1/gather_facts 15794 1726882658.50293: worker is 1 (out of 1 available) 15794 1726882658.50306: exiting _queue_task() for managed_node1/gather_facts 15794 1726882658.50318: done queuing things up, now waiting for results queue to drain 15794 1726882658.50320: waiting for pending results... 
15794 1726882658.50752: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15794 1726882658.50768: in run() - task 0affe814-3a2d-94e5-e48f-00000000056d 15794 1726882658.50795: variable 'ansible_search_path' from source: unknown 15794 1726882658.50843: calling self._execute() 15794 1726882658.51107: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882658.51126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882658.51146: variable 'omit' from source: magic vars 15794 1726882658.51863: variable 'ansible_distribution_major_version' from source: facts 15794 1726882658.52141: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882658.52146: variable 'omit' from source: magic vars 15794 1726882658.52148: variable 'omit' from source: magic vars 15794 1726882658.52163: variable 'omit' from source: magic vars 15794 1726882658.52220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882658.52295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882658.52392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882658.52498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882658.52518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882658.52562: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882658.52797: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882658.52801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882658.53015: Set connection var ansible_connection to ssh 15794 1726882658.53019: Set 
connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882658.53022: Set connection var ansible_pipelining to False 15794 1726882658.53024: Set connection var ansible_shell_executable to /bin/sh 15794 1726882658.53026: Set connection var ansible_shell_type to sh 15794 1726882658.53029: Set connection var ansible_timeout to 10 15794 1726882658.53138: variable 'ansible_shell_executable' from source: unknown 15794 1726882658.53150: variable 'ansible_connection' from source: unknown 15794 1726882658.53161: variable 'ansible_module_compression' from source: unknown 15794 1726882658.53170: variable 'ansible_shell_type' from source: unknown 15794 1726882658.53183: variable 'ansible_shell_executable' from source: unknown 15794 1726882658.53193: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882658.53204: variable 'ansible_pipelining' from source: unknown 15794 1726882658.53214: variable 'ansible_timeout' from source: unknown 15794 1726882658.53227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882658.53659: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882658.53841: variable 'omit' from source: magic vars 15794 1726882658.53844: starting attempt loop 15794 1726882658.53847: running the handler 15794 1726882658.53849: variable 'ansible_facts' from source: unknown 15794 1726882658.53851: _low_level_execute_command(): starting 15794 1726882658.53853: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882658.55192: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882658.55364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882658.55383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882658.55406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882658.55500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882658.57274: stdout chunk (state=3): >>>/root <<< 15794 1726882658.57446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882658.57465: stdout chunk (state=3): >>><<< 15794 1726882658.57484: stderr chunk (state=3): >>><<< 15794 1726882658.57513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882658.57628: _low_level_execute_command(): starting 15794 1726882658.57633: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082 `" && echo ansible-tmp-1726882658.5752103-17872-235076825004082="` echo /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082 `" ) && sleep 0' 15794 1726882658.58225: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882658.58267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882658.58297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882658.58405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882658.58438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882658.58547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882658.60499: stdout chunk (state=3): >>>ansible-tmp-1726882658.5752103-17872-235076825004082=/root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082 <<< 15794 1726882658.60687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882658.60690: stdout chunk (state=3): >>><<< 15794 1726882658.60693: stderr chunk (state=3): >>><<< 15794 1726882658.60710: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882658.5752103-17872-235076825004082=/root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882658.60747: variable 'ansible_module_compression' from source: unknown 15794 1726882658.60805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15794 1726882658.61039: variable 'ansible_facts' from source: unknown 15794 1726882658.61047: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/AnsiballZ_setup.py 15794 1726882658.61281: Sending initial data 15794 1726882658.61292: Sent initial data (154 bytes) 15794 1726882658.61805: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882658.61820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882658.61849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882658.61958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
<<< 15794 1726882658.61980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882658.61998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882658.62085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882658.63681: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882658.63727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882658.63811: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpx4av21bq /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/AnsiballZ_setup.py <<< 15794 1726882658.63823: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/AnsiballZ_setup.py" <<< 15794 1726882658.63861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpx4av21bq" to remote "/root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/AnsiballZ_setup.py" <<< 15794 1726882658.66605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882658.66761: stderr chunk (state=3): >>><<< 15794 1726882658.66773: stdout chunk (state=3): >>><<< 15794 1726882658.66913: done transferring module to remote 15794 1726882658.66922: _low_level_execute_command(): starting 15794 1726882658.66929: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/ /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/AnsiballZ_setup.py && sleep 0' 15794 1726882658.67518: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882658.67604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882658.67621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882658.67652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882658.67789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882658.67823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882658.67865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882658.69720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882658.69807: stderr chunk (state=3): >>><<< 15794 1726882658.69810: stdout chunk (state=3): >>><<< 15794 1726882658.69827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 
10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882658.69951: _low_level_execute_command(): starting 15794 1726882658.69956: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/AnsiballZ_setup.py && sleep 0' 15794 1726882658.70513: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882658.70516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882658.70519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882658.70521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882658.70524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882658.70625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882658.70693: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882659.39169: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "39", "epoch": "1726882659", "epoch_int": "1726882659", "date": "2024-09-20", "time": "21:37:39", "iso8601_micro": "2024-09-21T01:37:39.013578Z", "iso8601": "2024-09-21T01:37:39Z", 
"iso8601_basic": "20240920T213739013578", "iso8601_basic_short": "20240920T213739", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LO<<< 15794 1726882659.39191: stdout chunk (state=3): >>>GNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2870, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, 
"ansible_memory_mb": {"real": {"total": 3717, "used": 847, "free": 2870}, "nocache": {"free": 3476, "used": 241}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", 
"host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 613, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205124096, "block_size": 4096, "block_total": 64483404, "block_available": 61329376, "block_used": 3154028, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.54443359375, "5m": 0.45263671875, "15m": 0.22900390625}, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", 
"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15794 1726882659.41261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 15794 1726882659.41310: stderr chunk (state=3): >>><<< 15794 1726882659.41319: stdout chunk (state=3): >>><<< 15794 1726882659.41373: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", 
"ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": 
["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "39", "epoch": "1726882659", "epoch_int": "1726882659", "date": "2024-09-20", "time": "21:37:39", "iso8601_micro": "2024-09-21T01:37:39.013578Z", "iso8601": "2024-09-21T01:37:39Z", "iso8601_basic": "20240920T213739013578", "iso8601_basic_short": "20240920T213739", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": 
"UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2870, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 847, "free": 2870}, "nocache": {"free": 3476, "used": 241}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], 
"labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 613, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251205124096, "block_size": 4096, "block_total": 64483404, "block_available": 61329376, "block_used": 3154028, "inode_total": 16384000, "inode_available": 16303774, "inode_used": 80226, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.54443359375, "5m": 0.45263671875, "15m": 0.22900390625}, 
"ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882659.41914: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882659.41917: _low_level_execute_command(): starting 15794 1726882659.41920: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882658.5752103-17872-235076825004082/ > /dev/null 2>&1 && sleep 0' 15794 1726882659.42608: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882659.42652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882659.42670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882659.42782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882659.42797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882659.42894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882659.44870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882659.44881: stdout chunk (state=3): >>><<< 15794 1726882659.44896: stderr chunk (state=3): >>><<< 15794 1726882659.44922: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882659.44939: handler run complete 15794 1726882659.45135: variable 'ansible_facts' from source: unknown 15794 1726882659.45269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882659.45755: variable 'ansible_facts' from source: unknown 15794 1726882659.45888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882659.46143: attempt loop complete, returning result 15794 1726882659.46147: _execute() done 15794 1726882659.46149: dumping result to json 15794 1726882659.46152: done dumping result, returning 15794 1726882659.46154: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-94e5-e48f-00000000056d] 15794 1726882659.46162: sending task result for task 0affe814-3a2d-94e5-e48f-00000000056d 15794 1726882659.46705: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000056d 15794 1726882659.46708: WORKER PROCESS EXITING ok: [managed_node1] 15794 1726882659.47185: no more pending results, returning what we have 15794 1726882659.47189: results queue empty 15794 1726882659.47190: checking for any_errors_fatal 15794 1726882659.47192: done checking for any_errors_fatal 15794 1726882659.47193: checking for max_fail_percentage 15794 1726882659.47195: done checking for max_fail_percentage 15794 1726882659.47196: 
checking to see if all hosts have failed and the running result is not ok 15794 1726882659.47197: done checking to see if all hosts have failed 15794 1726882659.47198: getting the remaining hosts for this loop 15794 1726882659.47200: done getting the remaining hosts for this loop 15794 1726882659.47204: getting the next task for host managed_node1 15794 1726882659.47210: done getting next task for host managed_node1 15794 1726882659.47212: ^ task is: TASK: meta (flush_handlers) 15794 1726882659.47214: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882659.47219: getting variables 15794 1726882659.47221: in VariableManager get_vars() 15794 1726882659.47361: Calling all_inventory to load vars for managed_node1 15794 1726882659.47365: Calling groups_inventory to load vars for managed_node1 15794 1726882659.47369: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882659.47380: Calling all_plugins_play to load vars for managed_node1 15794 1726882659.47384: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882659.47388: Calling groups_plugins_play to load vars for managed_node1 15794 1726882659.49683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882659.52673: done with get_vars() 15794 1726882659.52708: done getting variables 15794 1726882659.52793: in VariableManager get_vars() 15794 1726882659.52805: Calling all_inventory to load vars for managed_node1 15794 1726882659.52808: Calling groups_inventory to load vars for managed_node1 15794 1726882659.52811: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882659.52817: Calling all_plugins_play to load vars for managed_node1 15794 
1726882659.52820: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882659.52823: Calling groups_plugins_play to load vars for managed_node1 15794 1726882659.56731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882659.63808: done with get_vars() 15794 1726882659.63862: done queuing things up, now waiting for results queue to drain 15794 1726882659.63865: results queue empty 15794 1726882659.63866: checking for any_errors_fatal 15794 1726882659.63873: done checking for any_errors_fatal 15794 1726882659.63874: checking for max_fail_percentage 15794 1726882659.63875: done checking for max_fail_percentage 15794 1726882659.63876: checking to see if all hosts have failed and the running result is not ok 15794 1726882659.63877: done checking to see if all hosts have failed 15794 1726882659.63883: getting the remaining hosts for this loop 15794 1726882659.63884: done getting the remaining hosts for this loop 15794 1726882659.63888: getting the next task for host managed_node1 15794 1726882659.63893: done getting next task for host managed_node1 15794 1726882659.63896: ^ task is: TASK: Verify network state restored to default 15794 1726882659.63898: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882659.63901: getting variables 15794 1726882659.63902: in VariableManager get_vars() 15794 1726882659.63916: Calling all_inventory to load vars for managed_node1 15794 1726882659.63919: Calling groups_inventory to load vars for managed_node1 15794 1726882659.63922: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882659.63929: Calling all_plugins_play to load vars for managed_node1 15794 1726882659.63932: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882659.64142: Calling groups_plugins_play to load vars for managed_node1 15794 1726882659.68171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882659.72498: done with get_vars() 15794 1726882659.72539: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 21:37:39 -0400 (0:00:01.226) 0:00:57.284 ****** 15794 1726882659.72638: entering _queue_task() for managed_node1/include_tasks 15794 1726882659.73258: worker is 1 (out of 1 available) 15794 1726882659.73269: exiting _queue_task() for managed_node1/include_tasks 15794 1726882659.73279: done queuing things up, now waiting for results queue to drain 15794 1726882659.73281: waiting for pending results... 
15794 1726882659.73367: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 15794 1726882659.73511: in run() - task 0affe814-3a2d-94e5-e48f-000000000078 15794 1726882659.73622: variable 'ansible_search_path' from source: unknown 15794 1726882659.73627: calling self._execute() 15794 1726882659.73692: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882659.73706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882659.73730: variable 'omit' from source: magic vars 15794 1726882659.74196: variable 'ansible_distribution_major_version' from source: facts 15794 1726882659.74214: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882659.74226: _execute() done 15794 1726882659.74236: dumping result to json 15794 1726882659.74245: done dumping result, returning 15794 1726882659.74256: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affe814-3a2d-94e5-e48f-000000000078] 15794 1726882659.74281: sending task result for task 0affe814-3a2d-94e5-e48f-000000000078 15794 1726882659.74482: no more pending results, returning what we have 15794 1726882659.74490: in VariableManager get_vars() 15794 1726882659.74531: Calling all_inventory to load vars for managed_node1 15794 1726882659.74536: Calling groups_inventory to load vars for managed_node1 15794 1726882659.74541: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882659.74556: Calling all_plugins_play to load vars for managed_node1 15794 1726882659.74560: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882659.74564: Calling groups_plugins_play to load vars for managed_node1 15794 1726882659.75151: done sending task result for task 0affe814-3a2d-94e5-e48f-000000000078 15794 1726882659.75155: WORKER PROCESS EXITING 15794 1726882659.77957: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882659.81402: done with get_vars() 15794 1726882659.81439: variable 'ansible_search_path' from source: unknown 15794 1726882659.81456: we have included files to process 15794 1726882659.81457: generating all_blocks data 15794 1726882659.81459: done generating all_blocks data 15794 1726882659.81460: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15794 1726882659.81461: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15794 1726882659.81464: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15794 1726882659.81993: done processing included file 15794 1726882659.81995: iterating over new_blocks loaded from include file 15794 1726882659.81997: in VariableManager get_vars() 15794 1726882659.82012: done with get_vars() 15794 1726882659.82014: filtering new block on tags 15794 1726882659.82038: done filtering new block on tags 15794 1726882659.82041: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 15794 1726882659.82051: extending task lists for all hosts with included blocks 15794 1726882659.82093: done extending task lists 15794 1726882659.82094: done processing included files 15794 1726882659.82095: results queue empty 15794 1726882659.82096: checking for any_errors_fatal 15794 1726882659.82098: done checking for any_errors_fatal 15794 1726882659.82099: checking for max_fail_percentage 15794 1726882659.82100: done checking for max_fail_percentage 15794 1726882659.82102: checking to see if all hosts have failed and the running 
result is not ok 15794 1726882659.82103: done checking to see if all hosts have failed 15794 1726882659.82104: getting the remaining hosts for this loop 15794 1726882659.82105: done getting the remaining hosts for this loop 15794 1726882659.82109: getting the next task for host managed_node1 15794 1726882659.82113: done getting next task for host managed_node1 15794 1726882659.82116: ^ task is: TASK: Check routes and DNS 15794 1726882659.82119: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882659.82121: getting variables 15794 1726882659.82122: in VariableManager get_vars() 15794 1726882659.82132: Calling all_inventory to load vars for managed_node1 15794 1726882659.82138: Calling groups_inventory to load vars for managed_node1 15794 1726882659.82141: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882659.82147: Calling all_plugins_play to load vars for managed_node1 15794 1726882659.82150: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882659.82159: Calling groups_plugins_play to load vars for managed_node1 15794 1726882659.85455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882659.91099: done with get_vars() 15794 1726882659.91140: done getting variables 15794 1726882659.91189: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:37:39 -0400 (0:00:00.185) 0:00:57.470 ****** 15794 1726882659.91222: entering _queue_task() for managed_node1/shell 15794 1726882659.91981: worker is 1 (out of 1 available) 15794 1726882659.91995: exiting _queue_task() for managed_node1/shell 15794 1726882659.92009: done queuing things up, now waiting for results queue to drain 15794 1726882659.92010: waiting for pending results... 
15794 1726882659.92524: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 15794 1726882659.92761: in run() - task 0affe814-3a2d-94e5-e48f-00000000057e 15794 1726882659.93141: variable 'ansible_search_path' from source: unknown 15794 1726882659.93145: variable 'ansible_search_path' from source: unknown 15794 1726882659.93149: calling self._execute() 15794 1726882659.93152: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882659.93155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882659.93157: variable 'omit' from source: magic vars 15794 1726882659.94340: variable 'ansible_distribution_major_version' from source: facts 15794 1726882659.94344: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882659.94347: variable 'omit' from source: magic vars 15794 1726882659.94350: variable 'omit' from source: magic vars 15794 1726882659.94352: variable 'omit' from source: magic vars 15794 1726882659.94355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15794 1726882659.94740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15794 1726882659.94744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15794 1726882659.94747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882659.94749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15794 1726882659.94752: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15794 1726882659.94754: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882659.94757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882659.94981: 
Set connection var ansible_connection to ssh 15794 1726882659.94997: Set connection var ansible_module_compression to ZIP_DEFLATED 15794 1726882659.95009: Set connection var ansible_pipelining to False 15794 1726882659.95020: Set connection var ansible_shell_executable to /bin/sh 15794 1726882659.95028: Set connection var ansible_shell_type to sh 15794 1726882659.95044: Set connection var ansible_timeout to 10 15794 1726882659.95086: variable 'ansible_shell_executable' from source: unknown 15794 1726882659.95440: variable 'ansible_connection' from source: unknown 15794 1726882659.95444: variable 'ansible_module_compression' from source: unknown 15794 1726882659.95447: variable 'ansible_shell_type' from source: unknown 15794 1726882659.95449: variable 'ansible_shell_executable' from source: unknown 15794 1726882659.95452: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882659.95454: variable 'ansible_pipelining' from source: unknown 15794 1726882659.95456: variable 'ansible_timeout' from source: unknown 15794 1726882659.95458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882659.95563: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882659.95586: variable 'omit' from source: magic vars 15794 1726882659.95597: starting attempt loop 15794 1726882659.95645: running the handler 15794 1726882659.95662: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15794 1726882659.95689: 
_low_level_execute_command(): starting 15794 1726882659.96000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15794 1726882659.97255: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882659.97273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882659.97450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882659.97528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882659.99298: stdout chunk (state=3): >>>/root <<< 15794 1726882659.99572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882659.99577: stdout chunk (state=3): >>><<< 15794 1726882659.99583: stderr chunk (state=3): >>><<< 15794 1726882659.99608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882659.99635: _low_level_execute_command(): starting 15794 1726882659.99651: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681 `" && echo ansible-tmp-1726882659.99618-17917-27576214329681="` echo /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681 `" ) && sleep 0' 15794 1726882660.01008: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882660.01022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15794 1726882660.01156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15794 1726882660.01175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 15794 1726882660.01201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882660.01331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882660.01357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882660.01455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882660.03458: stdout chunk (state=3): >>>ansible-tmp-1726882659.99618-17917-27576214329681=/root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681 <<< 15794 1726882660.03638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882660.03649: stdout chunk (state=3): >>><<< 15794 1726882660.03661: stderr chunk (state=3): >>><<< 15794 1726882660.03685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882659.99618-17917-27576214329681=/root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882660.03726: variable 'ansible_module_compression' from source: unknown 15794 1726882660.03792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15794pdp21tn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15794 1726882660.03845: variable 'ansible_facts' from source: unknown 15794 1726882660.03930: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/AnsiballZ_command.py 15794 1726882660.04167: Sending initial data 15794 1726882660.04170: Sent initial data (153 bytes) 15794 1726882660.04753: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882660.04844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882660.04884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882660.04904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882660.04918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882660.05009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882660.06601: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15794 1726882660.06656: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15794 1726882660.06706: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpmec1l3io /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/AnsiballZ_command.py <<< 15794 1726882660.06729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/AnsiballZ_command.py" <<< 15794 1726882660.06772: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15794 1726882660.06801: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-15794pdp21tn0/tmpmec1l3io" to remote "/root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/AnsiballZ_command.py" <<< 15794 1726882660.08053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882660.08069: stderr chunk (state=3): >>><<< 15794 1726882660.08204: stdout chunk (state=3): >>><<< 15794 1726882660.08208: done transferring module to remote 15794 1726882660.08211: _low_level_execute_command(): starting 15794 1726882660.08213: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/ /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/AnsiballZ_command.py && sleep 0' 15794 1726882660.09216: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882660.09554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882660.09577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882660.09670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882660.11863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882660.11867: stdout chunk (state=3): >>><<< 15794 1726882660.11869: stderr chunk (state=3): >>><<< 15794 1726882660.11978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882660.11983: _low_level_execute_command(): starting 15794 1726882660.11987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/AnsiballZ_command.py && sleep 0' 15794 1726882660.12954: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882660.12969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15794 1726882660.13091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882660.13105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882660.13257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882660.13275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882660.13338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882660.31319: stdout chunk (state=3): 
>>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8c:42:87:d8:29 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3002sec preferred_lft 3002sec\n inet6 fe80::bb10:9a17:6b35:7604/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:37:40.302227", "end": "2024-09-20 21:37:40.311031", "delta": "0:00:00.008804", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15794 1726882660.32926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 15794 1726882660.33140: stderr chunk (state=3): >>><<< 15794 1726882660.33144: stdout chunk (state=3): >>><<< 15794 1726882660.33149: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8c:42:87:d8:29 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3002sec preferred_lft 3002sec\n inet6 fe80::bb10:9a17:6b35:7604/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:37:40.302227", "end": "2024-09-20 21:37:40.311031", "delta": "0:00:00.008804", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 15794 1726882660.33157: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15794 1726882660.33159: _low_level_execute_command(): starting 15794 1726882660.33162: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882659.99618-17917-27576214329681/ > /dev/null 2>&1 && sleep 0' 15794 1726882660.33870: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 15794 1726882660.33884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15794 1726882660.33950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15794 1726882660.34009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15794 1726882660.34032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15794 1726882660.34055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15794 1726882660.34147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15794 1726882660.36176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15794 1726882660.36189: stdout chunk (state=3): >>><<< 15794 1726882660.36202: stderr chunk (state=3): >>><<< 15794 1726882660.36223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15794 1726882660.36239: handler run complete 15794 1726882660.36285: Evaluated conditional (False): False 15794 1726882660.36307: attempt loop complete, returning result 15794 1726882660.36319: _execute() done 15794 1726882660.36327: dumping result to json 15794 1726882660.36344: done dumping result, returning 15794 1726882660.36358: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0affe814-3a2d-94e5-e48f-00000000057e] 15794 1726882660.36368: sending task result for task 0affe814-3a2d-94e5-e48f-00000000057e ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008804", "end": "2024-09-20 21:37:40.311031", "rc": 0, "start": "2024-09-20 21:37:40.302227" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:8c:42:87:d8:29 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 
3002sec preferred_lft 3002sec inet6 fe80::bb10:9a17:6b35:7604/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.217 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 15794 1726882660.36706: no more pending results, returning what we have 15794 1726882660.36711: results queue empty 15794 1726882660.36713: checking for any_errors_fatal 15794 1726882660.36715: done checking for any_errors_fatal 15794 1726882660.36716: checking for max_fail_percentage 15794 1726882660.36719: done checking for max_fail_percentage 15794 1726882660.36720: checking to see if all hosts have failed and the running result is not ok 15794 1726882660.36721: done checking to see if all hosts have failed 15794 1726882660.36722: getting the remaining hosts for this loop 15794 1726882660.36724: done getting the remaining hosts for this loop 15794 1726882660.36730: getting the next task for host managed_node1 15794 1726882660.36741: done getting next task for host managed_node1 15794 1726882660.36745: ^ task is: TASK: Verify DNS and network connectivity 15794 1726882660.36748: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882660.36757: getting variables 15794 1726882660.36759: in VariableManager get_vars() 15794 1726882660.36795: Calling all_inventory to load vars for managed_node1 15794 1726882660.36798: Calling groups_inventory to load vars for managed_node1 15794 1726882660.36803: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882660.36976: Calling all_plugins_play to load vars for managed_node1 15794 1726882660.36981: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882660.36988: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000057e 15794 1726882660.36991: WORKER PROCESS EXITING 15794 1726882660.36996: Calling groups_plugins_play to load vars for managed_node1 15794 1726882660.39788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882660.42959: done with get_vars() 15794 1726882660.43001: done getting variables 15794 1726882660.43072: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:37:40 -0400 (0:00:00.518) 0:00:57.989 ****** 15794 1726882660.43115: entering _queue_task() for managed_node1/shell 15794 1726882660.43482: worker is 1 (out of 1 available) 15794 1726882660.43495: exiting _queue_task() for managed_node1/shell 15794 1726882660.43509: done queuing things up, now waiting for results queue to drain 15794 1726882660.43510: waiting for pending results... 
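Aside: the diagnostic script that the "Check routes and DNS" task executed is shown verbatim in the `cmd` field of the result above. It can be saved and run standalone on a managed node for manual troubleshooting; the `/tmp/check_network_dns.sh` path is an illustrative assumption, not part of the playbook:

```shell
# Standalone copy of the script from the task's cmd field above.
# It prints interface addresses, IPv4/IPv6 routes, and resolver config.
cat > /tmp/check_network_dns.sh <<'EOF'
set -euo pipefail
echo IP
ip a
echo IP ROUTE
ip route
echo IP -6 ROUTE
ip -6 route
echo RESOLV
if [ -f /etc/resolv.conf ]; then
    cat /etc/resolv.conf
else
    echo NO /etc/resolv.conf
    ls -alrtF /etc/resolv.* || :
fi
EOF
chmod +x /tmp/check_network_dns.sh
```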
15794 1726882660.43829: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 15794 1726882660.44140: in run() - task 0affe814-3a2d-94e5-e48f-00000000057f 15794 1726882660.44144: variable 'ansible_search_path' from source: unknown 15794 1726882660.44147: variable 'ansible_search_path' from source: unknown 15794 1726882660.44150: calling self._execute() 15794 1726882660.44165: variable 'ansible_host' from source: host vars for 'managed_node1' 15794 1726882660.44178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15794 1726882660.44195: variable 'omit' from source: magic vars 15794 1726882660.44851: variable 'ansible_distribution_major_version' from source: facts 15794 1726882660.44880: Evaluated conditional (ansible_distribution_major_version != '6'): True 15794 1726882660.45078: variable 'ansible_facts' from source: unknown 15794 1726882660.46507: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 15794 1726882660.46586: when evaluation is False, skipping this task 15794 1726882660.46590: _execute() done 15794 1726882660.46593: dumping result to json 15794 1726882660.46595: done dumping result, returning 15794 1726882660.46598: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0affe814-3a2d-94e5-e48f-00000000057f] 15794 1726882660.46600: sending task result for task 0affe814-3a2d-94e5-e48f-00000000057f 15794 1726882660.46674: done sending task result for task 0affe814-3a2d-94e5-e48f-00000000057f skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 15794 1726882660.46735: no more pending results, returning what we have 15794 1726882660.46741: results queue empty 15794 1726882660.46742: checking for any_errors_fatal 15794 1726882660.46751: done checking for any_errors_fatal 15794 1726882660.46752: checking for max_fail_percentage 15794 
1726882660.46754: done checking for max_fail_percentage 15794 1726882660.46755: checking to see if all hosts have failed and the running result is not ok 15794 1726882660.46756: done checking to see if all hosts have failed 15794 1726882660.46757: getting the remaining hosts for this loop 15794 1726882660.46759: done getting the remaining hosts for this loop 15794 1726882660.46764: getting the next task for host managed_node1 15794 1726882660.46773: done getting next task for host managed_node1 15794 1726882660.46775: ^ task is: TASK: meta (flush_handlers) 15794 1726882660.46779: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882660.46783: getting variables 15794 1726882660.46785: in VariableManager get_vars() 15794 1726882660.46819: Calling all_inventory to load vars for managed_node1 15794 1726882660.46822: Calling groups_inventory to load vars for managed_node1 15794 1726882660.46826: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882660.46943: Calling all_plugins_play to load vars for managed_node1 15794 1726882660.46947: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882660.46952: Calling groups_plugins_play to load vars for managed_node1 15794 1726882660.47702: WORKER PROCESS EXITING 15794 1726882660.49341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882660.52489: done with get_vars() 15794 1726882660.52526: done getting variables 15794 1726882660.52615: in VariableManager get_vars() 15794 1726882660.52637: Calling all_inventory to load vars for managed_node1 15794 1726882660.52640: Calling groups_inventory to load vars for managed_node1 15794 1726882660.52644: Calling 
all_plugins_inventory to load vars for managed_node1 15794 1726882660.52650: Calling all_plugins_play to load vars for managed_node1 15794 1726882660.52654: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882660.52658: Calling groups_plugins_play to load vars for managed_node1 15794 1726882660.55311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882660.58857: done with get_vars() 15794 1726882660.58899: done queuing things up, now waiting for results queue to drain 15794 1726882660.58901: results queue empty 15794 1726882660.58903: checking for any_errors_fatal 15794 1726882660.58906: done checking for any_errors_fatal 15794 1726882660.58907: checking for max_fail_percentage 15794 1726882660.58908: done checking for max_fail_percentage 15794 1726882660.58909: checking to see if all hosts have failed and the running result is not ok 15794 1726882660.58910: done checking to see if all hosts have failed 15794 1726882660.58911: getting the remaining hosts for this loop 15794 1726882660.58913: done getting the remaining hosts for this loop 15794 1726882660.58916: getting the next task for host managed_node1 15794 1726882660.58921: done getting next task for host managed_node1 15794 1726882660.58923: ^ task is: TASK: meta (flush_handlers) 15794 1726882660.58925: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15794 1726882660.58928: getting variables 15794 1726882660.58929: in VariableManager get_vars() 15794 1726882660.58940: Calling all_inventory to load vars for managed_node1 15794 1726882660.58943: Calling groups_inventory to load vars for managed_node1 15794 1726882660.58947: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882660.58953: Calling all_plugins_play to load vars for managed_node1 15794 1726882660.58956: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882660.58960: Calling groups_plugins_play to load vars for managed_node1 15794 1726882660.61021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882660.64965: done with get_vars() 15794 1726882660.64999: done getting variables 15794 1726882660.65172: in VariableManager get_vars() 15794 1726882660.65185: Calling all_inventory to load vars for managed_node1 15794 1726882660.65188: Calling groups_inventory to load vars for managed_node1 15794 1726882660.65191: Calling all_plugins_inventory to load vars for managed_node1 15794 1726882660.65198: Calling all_plugins_play to load vars for managed_node1 15794 1726882660.65201: Calling groups_plugins_inventory to load vars for managed_node1 15794 1726882660.65205: Calling groups_plugins_play to load vars for managed_node1 15794 1726882660.69520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15794 1726882660.74751: done with get_vars() 15794 1726882660.74802: done queuing things up, now waiting for results queue to drain 15794 1726882660.74805: results queue empty 15794 1726882660.74806: checking for any_errors_fatal 15794 1726882660.74807: done checking for any_errors_fatal 15794 1726882660.74808: checking for max_fail_percentage 15794 1726882660.74810: done checking for max_fail_percentage 15794 1726882660.74811: checking to see if all hosts have failed and the running result is not 
ok 15794 1726882660.74812: done checking to see if all hosts have failed 15794 1726882660.74813: getting the remaining hosts for this loop 15794 1726882660.74814: done getting the remaining hosts for this loop 15794 1726882660.74823: getting the next task for host managed_node1 15794 1726882660.74827: done getting next task for host managed_node1 15794 1726882660.74828: ^ task is: None 15794 1726882660.74830: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15794 1726882660.74832: done queuing things up, now waiting for results queue to drain 15794 1726882660.74833: results queue empty 15794 1726882660.74836: checking for any_errors_fatal 15794 1726882660.74837: done checking for any_errors_fatal 15794 1726882660.74838: checking for max_fail_percentage 15794 1726882660.74839: done checking for max_fail_percentage 15794 1726882660.74840: checking to see if all hosts have failed and the running result is not ok 15794 1726882660.74841: done checking to see if all hosts have failed 15794 1726882660.74842: getting the next task for host managed_node1 15794 1726882660.74845: done getting next task for host managed_node1 15794 1726882660.74846: ^ task is: None 15794 1726882660.74847: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node1 : ok=82 changed=3 unreachable=0 failed=0 skipped=74 rescued=0 ignored=1 Friday 20 September 2024 21:37:40 -0400 (0:00:00.318) 0:00:58.307 ****** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.53s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.41s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.24s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which packages are installed --- 2.10s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 2.04s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Install iproute --------------------------------------------------------- 1.87s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.77s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Gathering Facts --------------------------------------------------------- 1.60s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6 Gathering Facts --------------------------------------------------------- 1.38s 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.34s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Gathering Facts --------------------------------------------------------- 1.34s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 fedora.linux_system_roles.network : Check which packages are installed --- 1.30s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.26s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.23s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Gathering Facts --------------------------------------------------------- 1.21s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Create veth interface lsr27 --------------------------------------------- 1.20s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Gathering Facts --------------------------------------------------------- 1.19s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 fedora.linux_system_roles.network : Re-test connectivity ---------------- 1.18s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Gathering Facts --------------------------------------------------------- 1.16s 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Gathering Facts --------------------------------------------------------- 0.99s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 15794 1726882660.75032: RUNNING CLEANUP