[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
18714 1726853402.98684: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
18714 1726853402.99559: Added group all to inventory
18714 1726853402.99562: Added group ungrouped to inventory
18714 1726853402.99566: Group all now contains ungrouped
18714 1726853402.99569: Examining possible inventory source: /tmp/network-iHm/inventory.yml
18714 1726853403.25516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
18714 1726853403.25576: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
18714 1726853403.25598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
18714 1726853403.25652: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
18714 1726853403.25727: Loaded config def from plugin (inventory/script)
18714 1726853403.25729: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
18714 1726853403.25769: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
18714 1726853403.25857: Loaded config def from plugin (inventory/yaml)
18714 1726853403.25859: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
18714 1726853403.25944: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
18714 1726853403.26352: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
18714 1726853403.26355: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
18714 1726853403.26358: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
18714 1726853403.26363: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
18714 1726853403.26368: Loading data from /tmp/network-iHm/inventory.yml
18714 1726853403.26434: /tmp/network-iHm/inventory.yml was not parsable by auto
18714 1726853403.26496: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
18714 1726853403.26534: Loading data from /tmp/network-iHm/inventory.yml
18714 1726853403.26615: group all already in inventory
18714 1726853403.26621: set inventory_file for managed_node1
18714 1726853403.26626: set inventory_dir for managed_node1
18714 1726853403.26627: Added host managed_node1 to inventory
18714 1726853403.26629: Added host managed_node1 to group all
18714 1726853403.26630: set ansible_host for managed_node1
18714 1726853403.26631: set ansible_ssh_extra_args for managed_node1
18714 1726853403.26634: set inventory_file for managed_node2
18714 1726853403.26637: set inventory_dir for managed_node2
18714 1726853403.26637: Added host managed_node2 to inventory
18714 1726853403.26639: Added host managed_node2 to group all
18714 1726853403.26640: set ansible_host for managed_node2
18714 1726853403.26640: set ansible_ssh_extra_args for managed_node2
18714 1726853403.26643: set inventory_file for managed_node3
18714 1726853403.26645: set inventory_dir for managed_node3
18714 1726853403.26646: Added host managed_node3 to inventory
18714 1726853403.26647: Added host managed_node3 to group all
18714 1726853403.26648: set ansible_host for managed_node3
18714 1726853403.26648: set ansible_ssh_extra_args for managed_node3
18714 1726853403.26650: Reconcile groups and hosts in inventory.
18714 1726853403.26654: Group ungrouped now contains managed_node1
18714 1726853403.26656: Group ungrouped now contains managed_node2
18714 1726853403.26658: Group ungrouped now contains managed_node3
18714 1726853403.26730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
18714 1726853403.26849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
18714 1726853403.26896: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
18714 1726853403.26923: Loaded config def from plugin (vars/host_group_vars)
18714 1726853403.26925: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
18714 1726853403.26931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
18714 1726853403.26938: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
18714 1726853403.26981: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
18714 1726853403.27697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853403.27786: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
18714 1726853403.27825: Loaded config def from plugin (connection/local)
18714 1726853403.27829: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
18714 1726853403.29069: Loaded config def from plugin (connection/paramiko_ssh)
18714 1726853403.29074: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
18714 1726853403.30444: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18714 1726853403.30480: Loaded config def from plugin (connection/psrp)
18714 1726853403.30487: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
18714 1726853403.31473: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18714 1726853403.31628: Loaded config def from plugin (connection/ssh)
18714 1726853403.31632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
18714 1726853403.35903: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18714 1726853403.36080: Loaded config def from plugin (connection/winrm)
18714 1726853403.36083: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
18714 1726853403.36114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
18714 1726853403.36381: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
18714 1726853403.36449: Loaded config def from plugin (shell/cmd)
18714 1726853403.36452: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
18714 1726853403.36583: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
18714 1726853403.36653: Loaded config def from plugin (shell/powershell)
18714 1726853403.36655: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
18714 1726853403.36825: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
18714 1726853403.37165: Loaded config def from plugin (shell/sh)
18714 1726853403.37167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
18714 1726853403.37203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
18714 1726853403.37441: Loaded config def from plugin (become/runas)
18714 1726853403.37444: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
18714 1726853403.37848: Loaded config def from plugin (become/su)
18714 1726853403.37850: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
18714 1726853403.38016: Loaded config def from plugin (become/sudo)
18714 1726853403.38019: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
18714 1726853403.38051: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18714 1726853403.38392: in VariableManager get_vars()
18714 1726853403.38413: done with get_vars()
18714 1726853403.38545: trying /usr/local/lib/python3.12/site-packages/ansible/modules
18714 1726853403.41493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
18714 1726853403.41612: in VariableManager get_vars()
18714 1726853403.41617: done with get_vars()
18714 1726853403.41620: variable 'playbook_dir' from source: magic vars
18714 1726853403.41621: variable 'ansible_playbook_python' from source: magic vars
18714 1726853403.41622: variable 'ansible_config_file' from source: magic vars
18714 1726853403.41622: variable 'groups' from source: magic vars
18714 1726853403.41623: variable 'omit' from source: magic vars
18714 1726853403.41624: variable 'ansible_version' from source: magic vars
18714 1726853403.41624: variable 'ansible_check_mode' from source: magic vars
18714 1726853403.41625: variable 'ansible_diff_mode' from source: magic vars
18714 1726853403.41626: variable 'ansible_forks' from source: magic vars
18714 1726853403.41626: variable 'ansible_inventory_sources' from source: magic vars
18714 1726853403.41627: variable 'ansible_skip_tags' from source: magic vars
18714 1726853403.41628: variable 'ansible_limit' from source: magic vars
18714 1726853403.41628: variable 'ansible_run_tags' from source: magic vars
18714 1726853403.41629: variable 'ansible_verbosity' from source: magic vars
18714 1726853403.41661: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml
18714 1726853403.42319: in VariableManager get_vars()
18714 1726853403.42335: done with get_vars()
18714 1726853403.42376: in VariableManager get_vars()
18714 1726853403.42397: done with get_vars()
18714 1726853403.42431: in VariableManager get_vars()
18714 1726853403.42442: done with get_vars()
18714 1726853403.42468: in VariableManager get_vars()
18714 1726853403.42484: done with get_vars()
18714 1726853403.42553: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18714 1726853403.42755: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18714 1726853403.43238: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18714 1726853403.43881: in VariableManager get_vars()
18714 1726853403.43900: done with get_vars()
18714 1726853403.44311: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
18714 1726853403.44448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18714 1726853403.45690: in VariableManager get_vars()
18714 1726853403.45708: done with get_vars()
18714 1726853403.45833: in VariableManager get_vars()
18714 1726853403.45837: done with get_vars()
18714 1726853403.45839: variable 'playbook_dir' from source: magic vars
18714 1726853403.45840: variable 'ansible_playbook_python' from source: magic vars
18714 1726853403.45841: variable 'ansible_config_file' from source: magic vars
18714 1726853403.45841: variable 'groups' from source: magic vars
18714 1726853403.45842: variable 'omit' from source: magic vars
18714 1726853403.45843: variable 'ansible_version' from source: magic vars
18714 1726853403.45843: variable 'ansible_check_mode' from source: magic vars
18714 1726853403.45844: variable 'ansible_diff_mode' from source: magic vars
18714 1726853403.45845: variable 'ansible_forks' from source: magic vars
18714 1726853403.45846: variable 'ansible_inventory_sources' from source: magic vars
18714 1726853403.45846: variable 'ansible_skip_tags' from source: magic vars
18714 1726853403.45847: variable 'ansible_limit' from source: magic vars
18714 1726853403.45848: variable 'ansible_run_tags' from source: magic vars
18714 1726853403.45848: variable 'ansible_verbosity' from source: magic vars
18714 1726853403.45885: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
18714 1726853403.45956: in VariableManager get_vars()
18714 1726853403.45959: done with get_vars()
18714 1726853403.45961: variable 'playbook_dir' from source: magic vars
18714 1726853403.45962: variable 'ansible_playbook_python' from source: magic vars
18714 1726853403.45963: variable 'ansible_config_file' from source: magic vars
18714 1726853403.45964: variable 'groups' from source: magic vars
18714 1726853403.45964: variable 'omit' from source: magic vars
18714 1726853403.45965: variable 'ansible_version' from source: magic vars
18714 1726853403.45966: variable 'ansible_check_mode' from source: magic vars
18714 1726853403.45966: variable 'ansible_diff_mode' from source: magic vars
18714 1726853403.45967: variable 'ansible_forks' from source: magic vars
18714 1726853403.45968: variable 'ansible_inventory_sources' from source: magic vars
18714 1726853403.45968: variable 'ansible_skip_tags' from source: magic vars
18714 1726853403.45969: variable 'ansible_limit' from source: magic vars
18714 1726853403.45970: variable 'ansible_run_tags' from source: magic vars
18714 1726853403.45972: variable 'ansible_verbosity' from source: magic vars
18714 1726853403.46002: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
18714 1726853403.46083: in VariableManager get_vars()
18714 1726853403.46094: done with get_vars()
18714 1726853403.46133: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18714 1726853403.46246: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18714 1726853403.46326: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18714 1726853403.46805: in VariableManager get_vars()
18714 1726853403.46823: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18714 1726853403.48382: in VariableManager get_vars()
18714 1726853403.48398: done with get_vars()
18714 1726853403.48427: in VariableManager get_vars()
18714 1726853403.48430: done with get_vars()
18714 1726853403.48432: variable 'playbook_dir' from source: magic vars
18714 1726853403.48432: variable 'ansible_playbook_python' from source: magic vars
18714 1726853403.48433: variable 'ansible_config_file' from source: magic vars
18714 1726853403.48434: variable 'groups' from source: magic vars
18714 1726853403.48434: variable 'omit' from source: magic vars
18714 1726853403.48435: variable 'ansible_version' from source: magic vars
18714 1726853403.48435: variable 'ansible_check_mode' from source: magic vars
18714 1726853403.48436: variable 'ansible_diff_mode' from source: magic vars
18714 1726853403.48437: variable 'ansible_forks' from source: magic vars
18714 1726853403.48437: variable 'ansible_inventory_sources' from source: magic vars
18714 1726853403.48438: variable 'ansible_skip_tags' from source: magic vars
18714 1726853403.48439: variable 'ansible_limit' from source: magic vars
18714 1726853403.48439: variable 'ansible_run_tags' from source: magic vars
18714 1726853403.48440: variable 'ansible_verbosity' from source: magic vars
18714 1726853403.48694: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
18714 1726853403.48765: in VariableManager get_vars()
18714 1726853403.48779: done with get_vars()
18714 1726853403.48819: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18714 1726853403.53124: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18714 1726853403.53514: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18714 1726853403.54256: in VariableManager get_vars()
18714 1726853403.54281: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18714 1726853403.57520: in VariableManager get_vars()
18714 1726853403.57535: done with get_vars()
18714 1726853403.57576: in VariableManager get_vars()
18714 1726853403.57589: done with get_vars()
18714 1726853403.57647: in VariableManager get_vars()
18714 1726853403.57662: done with get_vars()
18714 1726853403.57965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
18714 1726853403.57981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
18714 1726853403.58620: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
18714 1726853403.58787: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
18714 1726853403.58790: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
18714 1726853403.58822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
18714 1726853403.58848: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
18714 1726853403.59228: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
18714 1726853403.59495: Loaded config def from plugin (callback/default)
18714 1726853403.59497: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18714 1726853403.61341: Loaded config def from plugin (callback/junit)
18714 1726853403.61344: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18714 1726853403.61395: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
18714 1726853403.61469: Loaded config def from plugin (callback/minimal)
18714 1726853403.61473: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18714 1726853403.61513: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18714 1726853403.61582: Loaded config def from plugin (callback/tree)
18714 1726853403.61584: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
18714 1726853403.61740: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
18714 1726853403.61743: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18714 1726853403.61780: in VariableManager get_vars()
18714 1726853403.61795: done with get_vars()
18714 1726853403.61801: in VariableManager get_vars()
18714 1726853403.61809: done with get_vars()
18714 1726853403.61813: variable 'omit' from source: magic vars
18714 1726853403.61853: in VariableManager get_vars()
18714 1726853403.61872: done with get_vars()
18714 1726853403.61894: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
18714 1726853403.62467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
18714 1726853403.62543: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
18714 1726853403.62579: getting the remaining hosts for this loop
18714 1726853403.62581: done getting the remaining hosts for this loop
18714 1726853403.62584: getting the next task for host managed_node1
18714 1726853403.62588: done getting next task for host managed_node1
18714 1726853403.62590: ^ task is: TASK: Gathering Facts
18714 1726853403.62591: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853403.62599: getting variables
18714 1726853403.62600: in VariableManager get_vars()
18714 1726853403.62610: Calling all_inventory to load vars for managed_node1
18714 1726853403.62613: Calling groups_inventory to load vars for managed_node1
18714 1726853403.62615: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853403.62632: Calling all_plugins_play to load vars for managed_node1
18714 1726853403.62644: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853403.62648: Calling groups_plugins_play to load vars for managed_node1
18714 1726853403.62686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853403.62745: done with get_vars()
18714 1726853403.62755: done getting variables
18714 1726853403.62822: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Friday 20 September 2024 13:30:03 -0400 (0:00:00.011) 0:00:00.011 ******
18714 1726853403.62848: entering _queue_task() for managed_node1/gather_facts
18714 1726853403.62852: Creating lock for gather_facts
18714 1726853403.63240: worker is 1 (out of 1 available)
18714 1726853403.63254: exiting _queue_task() for managed_node1/gather_facts
18714 1726853403.63266: done queuing things up, now waiting for results queue to drain
18714 1726853403.63268: waiting for pending results...
18714 1726853403.63688: running TaskExecutor() for managed_node1/TASK: Gathering Facts
18714 1726853403.63694: in run() - task 02083763-bbaf-e784-4f7d-00000000007c
18714 1726853403.63697: variable 'ansible_search_path' from source: unknown
18714 1726853403.63699: calling self._execute()
18714 1726853403.63736: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853403.63746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853403.63761: variable 'omit' from source: magic vars
18714 1726853403.63868: variable 'omit' from source: magic vars
18714 1726853403.63902: variable 'omit' from source: magic vars
18714 1726853403.63953: variable 'omit' from source: magic vars
18714 1726853403.64004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18714 1726853403.64052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18714 1726853403.64143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18714 1726853403.64147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18714 1726853403.64153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18714 1726853403.64158: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18714 1726853403.64167: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853403.64178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853403.64293: Set connection var ansible_shell_executable to /bin/sh
18714 1726853403.64306: Set connection var ansible_timeout to 10
18714 1726853403.64315: Set connection var ansible_module_compression to ZIP_DEFLATED
18714 1726853403.64326: Set connection var ansible_connection to ssh
18714 1726853403.64335: Set connection var ansible_shell_type to sh
18714 1726853403.64344: Set connection var ansible_pipelining to False
18714 1726853403.64380: variable 'ansible_shell_executable' from source: unknown
18714 1726853403.64389: variable 'ansible_connection' from source: unknown
18714 1726853403.64472: variable 'ansible_module_compression' from source: unknown
18714 1726853403.64475: variable 'ansible_shell_type' from source: unknown
18714 1726853403.64478: variable 'ansible_shell_executable' from source: unknown
18714 1726853403.64481: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853403.64483: variable 'ansible_pipelining' from source: unknown
18714 1726853403.64485: variable 'ansible_timeout' from source: unknown
18714 1726853403.64487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853403.64618: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18714 1726853403.64635: variable 'omit' from source: magic vars
18714 1726853403.64643: starting attempt loop
18714 1726853403.64652: running the handler
18714 1726853403.64675: variable 'ansible_facts' from source: unknown
18714 1726853403.64704: _low_level_execute_command(): starting
18714 1726853403.64718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18714 1726853403.65493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
18714 1726853403.65573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<<
18714 1726853403.65589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18714 1726853403.65611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18714 1726853403.65697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18714 1726853403.67424: stdout chunk (state=3): >>>/root <<<
18714 1726853403.67557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18714 1726853403.67597: stdout chunk (state=3): >>><<<
18714 1726853403.67601: stderr chunk (state=3): >>><<<
18714 1726853403.67728: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18714 1726853403.67733: _low_level_execute_command(): starting
18714 1726853403.67736: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176 `" && echo ansible-tmp-1726853403.676455-18735-161115597916176="` echo /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176 `" ) && sleep 0'
18714 1726853403.69778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18714 1726853403.69846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18714 1726853403.69913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853403.70168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853403.70195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853403.70277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853403.72205: stdout chunk (state=3): >>>ansible-tmp-1726853403.676455-18735-161115597916176=/root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176 <<< 18714 1726853403.72397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853403.72407: stdout chunk (state=3): >>><<< 18714 1726853403.72417: stderr chunk (state=3): >>><<< 18714 1726853403.72460: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853403.676455-18735-161115597916176=/root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853403.72658: variable 'ansible_module_compression' from source: unknown 18714 1726853403.72662: ANSIBALLZ: Using generic lock for ansible.legacy.setup 18714 1726853403.72665: ANSIBALLZ: Acquiring lock 18714 1726853403.72667: ANSIBALLZ: Lock acquired: 139791971422656 18714 1726853403.72669: ANSIBALLZ: Creating module 18714 1726853404.28277: ANSIBALLZ: Writing module into payload 18714 1726853404.28305: ANSIBALLZ: Writing module 18714 1726853404.28326: ANSIBALLZ: Renaming module 18714 1726853404.28331: ANSIBALLZ: Done creating module 18714 1726853404.28359: variable 'ansible_facts' from source: unknown 18714 1726853404.28366: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853404.28377: _low_level_execute_command(): starting 18714 1726853404.28383: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 18714 1726853404.29488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853404.29626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853404.29643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853404.29647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853404.29770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853404.31464: stdout chunk (state=3): >>>PLATFORM <<< 18714 1726853404.31629: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 18714 1726853404.31699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853404.31777: stderr chunk (state=3): >>><<< 18714 1726853404.31878: stdout chunk (state=3): >>><<< 18714 1726853404.31898: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853404.31909 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 18714 1726853404.31977: _low_level_execute_command(): starting 18714 1726853404.31980: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 18714 1726853404.32380: Sending initial data 18714 1726853404.32384: Sent initial data (1181 bytes) 18714 1726853404.33251: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853404.33255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853404.33330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853404.33336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853404.33341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853404.33421: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853404.33431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853404.33433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853404.33746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853404.37210: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 18714 1726853404.37724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853404.37729: stderr chunk (state=3): >>><<< 18714 1726853404.37731: stdout chunk (state=3): >>><<< 18714 1726853404.37743: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853404.37934: variable 'ansible_facts' from source: unknown 18714 1726853404.37938: variable 'ansible_facts' from source: unknown 18714 1726853404.37952: variable 'ansible_module_compression' from source: unknown 18714 1726853404.38112: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853404.38140: variable 'ansible_facts' from source: unknown 18714 
1726853404.38499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/AnsiballZ_setup.py 18714 1726853404.38869: Sending initial data 18714 1726853404.38874: Sent initial data (153 bytes) 18714 1726853404.40086: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853404.40190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853404.40275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853404.41882: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18714 1726853404.42017: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 
1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853404.42056: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp0wfex0n8 /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/AnsiballZ_setup.py <<< 18714 1726853404.42059: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/AnsiballZ_setup.py" <<< 18714 1726853404.42098: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp0wfex0n8" to remote "/root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/AnsiballZ_setup.py" <<< 18714 1726853404.44758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853404.44846: stderr chunk (state=3): >>><<< 18714 1726853404.44887: stdout chunk (state=3): >>><<< 18714 1726853404.44911: done transferring module to remote 18714 1726853404.44925: _low_level_execute_command(): starting 18714 1726853404.44931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/ /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/AnsiballZ_setup.py && sleep 0' 18714 1726853404.45947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853404.45953: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 18714 1726853404.45956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853404.45958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853404.46021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853404.46034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853404.46142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853404.48056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853404.48061: stdout chunk (state=3): >>><<< 18714 1726853404.48063: stderr chunk (state=3): >>><<< 18714 1726853404.48087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853404.48091: _low_level_execute_command(): starting 18714 1726853404.48105: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/AnsiballZ_setup.py && sleep 0' 18714 1726853404.49528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853404.49532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853404.49536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853404.49538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853404.49540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853404.49674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853404.49830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853404.52175: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 18714 1726853404.52233: stdout chunk (state=3): >>>import '_io' # <<< 18714 1726853404.52237: stdout chunk (state=3): >>>import 'marshal' # <<< 18714 1726853404.52301: stdout chunk (state=3): >>>import 'posix' # <<< 18714 1726853404.52304: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 18714 1726853404.52416: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 18714 1726853404.52427: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 18714 1726853404.52443: stdout chunk (state=3): >>>import 'codecs' # <<< 18714 1726853404.52517: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18714 1726853404.52539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 18714 1726853404.52543: stdout chunk (state=3): >>>import 
'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7215bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72158bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 18714 1726853404.52561: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7215bea50> <<< 18714 1726853404.52634: stdout chunk (state=3): >>>import '_signal' # <<< 18714 1726853404.52656: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 18714 1726853404.52660: stdout chunk (state=3): >>>import '_stat' # <<< 18714 1726853404.52738: stdout chunk (state=3): >>>import 'stat' # <<< 18714 1726853404.52759: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18714 1726853404.52860: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 18714 1726853404.52884: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 18714 1726853404.52911: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 18714 1726853404.52916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 18714 1726853404.52959: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7215cd130> <<< 18714 1726853404.53084: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7215cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 18714 1726853404.53460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18714 1726853404.53481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18714 1726853404.53676: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 18714 1726853404.53680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853404.53758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213ebda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213ebfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18714 1726853404.53809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853404.53844: stdout chunk (state=3): >>>import 'itertools' # <<< 18714 1726853404.54082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 18714 1726853404.54093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214237a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721423e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721403a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721401190> <<< 18714 1726853404.54102: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213e8f50> <<< 18714 1726853404.54174: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 18714 1726853404.54181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18714 1726853404.54184: stdout chunk (state=3): >>>import '_sre' # <<< 18714 1726853404.54186: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py <<< 18714 1726853404.54223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 18714 1726853404.54228: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 18714 1726853404.54262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18714 1726853404.54276: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721443710> <<< 18714 1726853404.54289: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721442330> <<< 18714 1726853404.54317: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 18714 1726853404.54363: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721402060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213ea810> <<< 18714 1726853404.54395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18714 1726853404.54484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 18714 1726853404.54491: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214787a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213e81d0> <<< 18714 1726853404.54494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from 
'/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.54712: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721478c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721478b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721478ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213e6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214795b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721479280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72147a4b0> import 'importlib.util' # import 'runpy' # <<< 18714 1726853404.54726: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18714 1726853404.54888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214906e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721491df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721492c60> <<< 18714 1726853404.54984: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7214932c0> <<< 18714 1726853404.54990: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214921b0> <<< 18714 1726853404.55014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from 
'/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721493d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721493470> <<< 18714 1726853404.55091: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72147a510> <<< 18714 1726853404.55095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 18714 1726853404.55136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 18714 1726853404.55139: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18714 1726853404.55363: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721193b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fa7211bc620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bc3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7211bc650> <<< 18714 1726853404.55370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.55500: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7211bcf80> <<< 18714 1726853404.55625: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7211bd970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bc830> <<< 18714 1726853404.55631: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721191d30> <<< 18714 1726853404.55733: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bed20> <<< 18714 1726853404.55743: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bda90> <<< 18714 1726853404.55763: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72147ac00> <<< 18714 1726853404.55788: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18714 1726853404.55854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853404.55865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18714 1726853404.55898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18714 1726853404.55981: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211eb080> <<< 18714 1726853404.56004: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 18714 1726853404.56029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18714 1726853404.56091: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72120b410> <<< 18714 1726853404.56097: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18714 1726853404.56164: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18714 1726853404.56218: stdout chunk (state=3): >>>import 'ntpath' # <<< 18714 1726853404.56276: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72126c1d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18714 1726853404.56302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18714 1726853404.56323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18714 1726853404.56414: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72126e930> <<< 18714 1726853404.56511: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72126c2f0> <<< 18714 1726853404.56519: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7212391f0> <<< 18714 1726853404.56651: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b292e0> import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa72120a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bfc50> <<< 18714 1726853404.56749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 18714 1726853404.56954: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa72120a330> <<< 18714 1726853404.57198: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1vdtrasz/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 18714 1726853404.57340: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.57383: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 18714 1726853404.57401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18714 1726853404.57464: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18714 1726853404.57577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18714 1726853404.57685: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b8afc0> import '_typing' # <<< 18714 1726853404.57915: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b69eb0> <<< 18714 1726853404.57937: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b69010> # zipimport: zlib available 
<<< 18714 1726853404.57976: stdout chunk (state=3): >>>import 'ansible' # <<< 18714 1726853404.58000: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.58023: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.58056: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.58082: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 18714 1726853404.58186: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.60197: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.61579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b88e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18714 1726853404.61613: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.61631: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720bbe9f0> <<< 18714 1726853404.61688: stdout chunk (state=3): >>>import 
'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbe780> <<< 18714 1726853404.61712: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbe090> <<< 18714 1726853404.61726: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 18714 1726853404.61785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbe4e0> <<< 18714 1726853404.61829: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b8bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.61832: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720bbf770> <<< 18714 1726853404.61889: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720bbf9b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18714 1726853404.61936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 18714 1726853404.61984: stdout chunk (state=3): >>>import '_locale' # <<< 18714 1726853404.62028: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbfef0><<< 18714 
1726853404.62062: stdout chunk (state=3): >>> import 'pwd' # <<< 18714 1726853404.62092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18714 1726853404.62163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18714 1726853404.62166: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a25d00> <<< 18714 1726853404.62225: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a27920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 18714 1726853404.62245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 18714 1726853404.62289: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a282f0> <<< 18714 1726853404.62309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18714 1726853404.62365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a29490> <<< 18714 1726853404.62393: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18714 1726853404.62455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc 
matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18714 1726853404.62545: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a2bf80> <<< 18714 1726853404.62602: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a302c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a2a240> <<< 18714 1726853404.62613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 18714 1726853404.62642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 18714 1726853404.62716: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18714 1726853404.62893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 18714 1726853404.62921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a33fb0> <<< 18714 1726853404.62938: stdout chunk (state=3): >>>import '_tokenize' # <<< 18714 1726853404.62995: stdout chunk 
(state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a32a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a327e0> <<< 18714 1726853404.63088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18714 1726853404.63140: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a32d50> <<< 18714 1726853404.63166: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a2a750> <<< 18714 1726853404.63201: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a78200> <<< 18714 1726853404.63265: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a783b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18714 1726853404.63386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18714 1726853404.63390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a79e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a79be0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18714 1726853404.63409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18714 1726853404.63464: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a7c350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a7a480> <<< 18714 1726853404.63539: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853404.63576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 18714 1726853404.63598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 18714 1726853404.63643: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a7fad0> <<< 18714 1726853404.63931: stdout chunk (state=3): >>>import 'logging' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa720a7c4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.63959: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a80e00> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a7d010> <<< 18714 1726853404.64006: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.64039: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a80c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a78530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 18714 1726853404.64084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 18714 1726853404.64128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.64175: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72090c290> <<< 18714 1726853404.64603: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72090d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a82a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a83dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a82630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.64652: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.64695: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 18714 1726853404.64731: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18714 1726853404.64842: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.64945: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 18714 1726853404.65501: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.66041: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 18714 1726853404.66066: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 18714 1726853404.66134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7209116a0> <<< 18714 1726853404.66224: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 18714 1726853404.66398: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209124e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72090d580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18714 1726853404.66484: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.66645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa720912c00> <<< 18714 1726853404.66663: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.67102: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.67541: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.67616: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.67696: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 18714 1726853404.67793: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 18714 1726853404.67853: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.67941: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18714 1726853404.67991: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 18714 1726853404.68117: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.68122: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 18714 1726853404.68293: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.68518: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18714 1726853404.68590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18714 1726853404.68659: stdout chunk (state=3): >>>import '_ast' # <<< 18714 1726853404.68681: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209137a0> # zipimport: zlib available <<< 18714 1726853404.68747: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.68891: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18714 1726853404.68895: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 18714 1726853404.68977: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.68994: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.69028: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.69109: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.69320: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72091e300> <<< 18714 1726853404.69331: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209199d0> <<< 18714 1726853404.69357: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 18714 1726853404.69420: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.69482: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.69514: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.69636: stdout chunk 
(state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18714 1726853404.69676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18714 1726853404.69700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 18714 1726853404.69727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18714 1726853404.69761: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a06c30> <<< 18714 1726853404.69803: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720afe900> <<< 18714 1726853404.69890: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72091e450> <<< 18714 1726853404.69962: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72090d850> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.69965: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18714 1726853404.70041: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 18714 
1726853404.70086: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18714 1726853404.70113: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70206: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70209: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70221: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70253: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70295: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70330: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 18714 1726853404.70397: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70476: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70577: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 18714 1726853404.70592: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.70756: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.71243: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 18714 1726853404.71252: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b22d0> <<< 18714 1726853404.71255: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 18714 1726853404.71257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 18714 1726853404.71307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205bc260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.71558: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7205bc4d0> <<< 18714 1726853404.71570: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720998560> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b2e70> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b09b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fa7209b13a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18714 1726853404.71575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 18714 1726853404.71577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 18714 1726853404.71580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 18714 1726853404.71616: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7205bf530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205bede0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7205befc0> <<< 18714 1726853404.71659: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205be210> <<< 18714 1726853404.71662: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 18714 1726853404.71921: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205bf6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7206221e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720620200> <<< 18714 1726853404.71924: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b0620> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 18714 1726853404.71955: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 18714 1726853404.71978: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.72121: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 18714 1726853404.72145: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.72209: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 18714 1726853404.72240: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 18714 1726853404.72264: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 18714 1726853404.72433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.72469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.72490: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 18714 1726853404.72554: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.72607: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.72669: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.72731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 18714 1726853404.72749: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.73495: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.73633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 18714 1726853404.73647: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.73690: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.73741: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.73843: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 18714 1726853404.73864: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.73901: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 18714 1726853404.74273: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 18714 1726853404.74388: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720623590> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18714 1726853404.74603: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720622cf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.74793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.75047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.75069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 18714 1726853404.75103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18714 1726853404.75179: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.75232: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72065e390> <<< 18714 1726853404.75420: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72064e180> import 'ansible.module_utils.facts.system.python' # <<< 18714 1726853404.75477: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.75605: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.75615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 18714 1726853404.75632: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.75710: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.75828: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.75961: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 18714 1726853404.76062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 18714 1726853404.76065: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.76248: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.76253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 18714 1726853404.76255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fa720672030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720671c70> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 18714 1726853404.76292: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.76300: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.76337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 18714 1726853404.76598: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.76796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.76842: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.76892: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.76929: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 18714 1726853404.77182: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.77267: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 18714 1726853404.77270: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.77389: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.77506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 18714 1726853404.77558: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.77662: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.78148: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 18714 1726853404.78652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 18714 1726853404.78693: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 18714 1726853404.78767: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.78863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 18714 1726853404.79096: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 18714 1726853404.79217: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.79373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 18714 1726853404.79396: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 18714 1726853404.79417: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.79537: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 18714 1726853404.79596: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.79688: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.79892: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.80101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 18714 1726853404.80104: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.80136: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.80201: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 18714 
1726853404.80229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 18714 1726853404.80310: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.80383: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 18714 1726853404.80392: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.80423: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.80441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 18714 1726853404.80656: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.80664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.80895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 18714 1726853404.80934: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.81199: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 18714 1726853404.81329: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # <<< 18714 1726853404.81332: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.81405: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 18714 1726853404.81502: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.81505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 18714 1726853404.81545: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.81548: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 18714 1726853404.81752: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 18714 1726853404.81755: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.81851: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.81854: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 18714 1726853404.82053: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.82074: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.82308: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 18714 1726853404.82428: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.82631: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 18714 1726853404.82792: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853404.82815: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 18714 1726853404.83085: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 18714 
1726853404.83088: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853404.83288: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 18714 1726853404.83891: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 18714 1726853404.83916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 18714 1726853404.83933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 18714 1726853404.83942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 18714 1726853404.83976: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853404.83994: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72040ab40> <<< 18714 1726853404.84002: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720408680> <<< 18714 1726853404.84115: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72040b0e0> <<< 18714 1726853404.97403: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa720450650> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7204516d0> <<< 18714 1726853404.97445: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 18714 1726853404.97458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853404.97494: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 18714 1726853404.97498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7204538f0> <<< 18714 1726853404.97529: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7204525d0> <<< 18714 1726853404.97856: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 18714 1726853405.22872: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "St<<< 18714 1726853405.22944: stdout chunk (state=3): >>>ream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "04", "epoch": "1726853404", "epoch_int": "1726853404", "date": "2024-09-20", "time": "13:30:04", "iso8601_micro": "2024-09-20T17:30:04.848518Z", "iso8601": "2024-09-20T17:30:04Z", "iso8601_basic": "20240920T133004848518", "iso8601_basic_short": "20240920T133004", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.50634765625, "5m": 0.373046875, "15m": 0.17431640625}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 571, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794738176, "block_size": 4096, "block_total": 65519099, "block_available": 63914731, "block_used": 1604368, "inode_total": 131070960, "inode_available": 131029065, "inode_used": 41895, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], 
"ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_<<< 18714 1726853405.22954: stdout chunk (state=3): >>>fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853405.23717: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks <<< 18714 1726853405.23723: stdout chunk (state=3): >>># clear builtins._ <<< 18714 1726853405.23726: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__<<< 18714 1726853405.23729: stdout chunk (state=3): >>> # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 18714 1726853405.23740: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 18714 1726853405.23774: stdout chunk (state=3): >>> # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 18714 1726853405.23780: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig <<< 18714 1726853405.23791: stdout chunk (state=3): >>># cleanup[2] removing _distutils_hack <<< 18714 1726853405.23863: stdout chunk 
(state=3): >>># destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 <<< 18714 1726853405.23869: stdout chunk (state=3): >>># cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # 
cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437<<< 18714 1726853405.23983: stdout chunk (state=3): >>> # cleanup[2] removing collections.abc <<< 18714 1726853405.24025: stdout chunk (state=3): >>># cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 18714 1726853405.24051: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue<<< 18714 1726853405.24063: stdout chunk (state=3): >>> # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other<<< 18714 1726853405.24101: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils <<< 18714 1726853405.24111: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl<<< 18714 1726853405.24176: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version <<< 18714 1726853405.24212: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys<<< 18714 1726853405.24216: 
stdout chunk (state=3): >>> # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] 
removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy 
ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl<<< 18714 1726853405.24222: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # 
destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 18714 1726853405.24718: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 18714 1726853405.24737: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 18714 1726853405.24770: stdout chunk (state=3): >>># destroy _bz2 <<< 18714 1726853405.24777: stdout chunk (state=3): >>># destroy _compression # destroy _lzma <<< 18714 1726853405.24963: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 18714 1726853405.24989: stdout chunk (state=3): >>># destroy selinux <<< 18714 1726853405.24995: stdout chunk (state=3): >>># destroy shutil <<< 18714 1726853405.25020: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 18714 1726853405.25131: stdout chunk (state=3): >>># destroy 
ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 18714 1726853405.25153: stdout chunk (state=3): >>># destroy shlex <<< 18714 1726853405.25162: stdout chunk (state=3): >>># destroy fcntl <<< 18714 1726853405.25166: stdout chunk (state=3): >>># destroy datetime <<< 18714 1726853405.25196: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 18714 1726853405.25231: stdout chunk (state=3): >>># destroy _ssl <<< 18714 1726853405.25237: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 18714 1726853405.25316: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 18714 1726853405.25377: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 18714 1726853405.25383: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 18714 1726853405.25400: stdout chunk (state=3): >>># cleanup[3] wiping 
ansible.module_utils.six.moves<<< 18714 1726853405.25416: stdout chunk (state=3): >>> # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 18714 1726853405.25423: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 18714 1726853405.25577: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 18714 1726853405.25583: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18714 1726853405.25787: stdout chunk (state=3): >>># destroy sys.monitoring <<< 18714 1726853405.25793: stdout chunk (state=3): >>># destroy _socket <<< 18714 1726853405.25818: stdout chunk (state=3): >>># destroy _collections <<< 18714 1726853405.25859: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 18714 1726853405.25872: stdout chunk (state=3): >>># destroy tokenize<<< 18714 1726853405.25880: stdout chunk (state=3): >>> <<< 18714 1726853405.25975: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 18714 1726853405.26000: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules<<< 18714 1726853405.26006: stdout chunk (state=3): >>> # destroy _frozen_importlib <<< 18714 1726853405.26113: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 18714 1726853405.26119: stdout chunk (state=3): 
>>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io <<< 18714 1726853405.26182: stdout chunk (state=3): >>># destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 18714 1726853405.26210: stdout chunk (state=3): >>># destroy _hashlib <<< 18714 1726853405.26231: stdout chunk (state=3): >>># destroy _operator # destroy _sre <<< 18714 1726853405.26237: stdout chunk (state=3): >>># destroy _string # destroy re <<< 18714 1726853405.26333: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18714 1726853405.26878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853405.26881: stdout chunk (state=3): >>><<< 18714 1726853405.26883: stderr chunk (state=3): >>><<< 18714 1726853405.27003: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa7215bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72158bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7215bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7215cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7215cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213ebda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213ebfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214237a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa721423e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721403a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721401190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213e8f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721443710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721442330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721402060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213ea810> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214787a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213e81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721478c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721478b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721478ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7213e6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214795b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721479280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72147a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214906e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721491df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721492c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7214932c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7214921b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721493d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721493470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72147a510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa721193b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7211bc620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bc3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7211bc650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7211bcf80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7211bd970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bc830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa721191d30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bda90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72147ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa7211eb080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72120b410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72126c1d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72126e930> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72126c2f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7212391f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa720b292e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72120a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7211bfc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa72120a330> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1vdtrasz/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b8afc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b69eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b69010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b88e90> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720bbe9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbe780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbe090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbe4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720b8bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720bbf770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720bbf9b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720bbfef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a25d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a27920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a282f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a29490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa720a2bf80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a302c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a2a240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a33fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a32a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a327e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a32d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a2a750> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a78200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a783b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a79e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a79be0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a7c350> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa720a7a480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a7fad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a7c4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a80e00> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a7d010> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a80c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a78530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72090c290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72090d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a82a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720a83dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a82630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7209116a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209124e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72090d580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720912c00> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209137a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72091e300> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209199d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720a06c30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720afe900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72091e450> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72090d850> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b22d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205bc260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7205bc4d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720998560> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b2e70> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b09b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b13a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7205bf530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205bede0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7205befc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205be210> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7205bf6e0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7206221e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720620200> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7209b0620> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720623590> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720622cf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72065e390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72064e180> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa720672030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720671c70> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa72040ab40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720408680> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa72040b0e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa720450650> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7204516d0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7204538f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7204525d0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "04", "epoch": "1726853404", "epoch_int": "1726853404", "date": "2024-09-20", "time": "13:30:04", "iso8601_micro": "2024-09-20T17:30:04.848518Z", "iso8601": "2024-09-20T17:30:04Z", "iso8601_basic": "20240920T133004848518", "iso8601_basic_short": "20240920T133004", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.50634765625, "5m": 0.373046875, "15m": 0.17431640625}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 571, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794738176, "block_size": 4096, "block_total": 65519099, "block_available": 63914731, "block_used": 1604368, "inode_total": 131070960, "inode_available": 131029065, "inode_used": 41895, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", 
"network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # 
cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # 
cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # 
cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # 
cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy 
_ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] 
wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib 
# destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
18714 1726853405.29127: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853405.29172: _low_level_execute_command(): starting 18714 1726853405.29175: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853403.676455-18735-161115597916176/ > /dev/null 2>&1 && sleep 0' 18714 1726853405.29864: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853405.29883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853405.29942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853405.29988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853405.30006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853405.30026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853405.30107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853405.32001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853405.32004: stdout chunk (state=3): >>><<< 18714 1726853405.32007: stderr chunk (state=3): >>><<< 18714 1726853405.32123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 
1726853405.32127: handler run complete 18714 1726853405.32184: variable 'ansible_facts' from source: unknown 18714 1726853405.32297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853405.32627: variable 'ansible_facts' from source: unknown 18714 1726853405.32713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853405.32867: attempt loop complete, returning result 18714 1726853405.32881: _execute() done 18714 1726853405.32900: dumping result to json 18714 1726853405.32934: done dumping result, returning 18714 1726853405.32946: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-00000000007c] 18714 1726853405.32954: sending task result for task 02083763-bbaf-e784-4f7d-00000000007c 18714 1726853405.34062: done sending task result for task 02083763-bbaf-e784-4f7d-00000000007c ok: [managed_node1] 18714 1726853405.34199: no more pending results, returning what we have 18714 1726853405.34202: results queue empty 18714 1726853405.34203: checking for any_errors_fatal 18714 1726853405.34204: done checking for any_errors_fatal 18714 1726853405.34205: checking for max_fail_percentage 18714 1726853405.34206: done checking for max_fail_percentage 18714 1726853405.34207: checking to see if all hosts have failed and the running result is not ok 18714 1726853405.34208: done checking to see if all hosts have failed 18714 1726853405.34209: getting the remaining hosts for this loop 18714 1726853405.34210: done getting the remaining hosts for this loop 18714 1726853405.34213: getting the next task for host managed_node1 18714 1726853405.34219: done getting next task for host managed_node1 18714 1726853405.34221: ^ task is: TASK: meta (flush_handlers) 18714 1726853405.34222: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853405.34226: getting variables 18714 1726853405.34228: in VariableManager get_vars() 18714 1726853405.34247: Calling all_inventory to load vars for managed_node1 18714 1726853405.34250: Calling groups_inventory to load vars for managed_node1 18714 1726853405.34253: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853405.34262: Calling all_plugins_play to load vars for managed_node1 18714 1726853405.34265: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853405.34267: Calling groups_plugins_play to load vars for managed_node1 18714 1726853405.34279: WORKER PROCESS EXITING 18714 1726853405.34469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853405.34683: done with get_vars() 18714 1726853405.34694: done getting variables 18714 1726853405.34768: in VariableManager get_vars() 18714 1726853405.34780: Calling all_inventory to load vars for managed_node1 18714 1726853405.34782: Calling groups_inventory to load vars for managed_node1 18714 1726853405.34785: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853405.34789: Calling all_plugins_play to load vars for managed_node1 18714 1726853405.34791: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853405.34795: Calling groups_plugins_play to load vars for managed_node1 18714 1726853405.34940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853405.35136: done with get_vars() 18714 1726853405.35148: done queuing things up, now waiting for results queue to drain 18714 1726853405.35150: results queue empty 18714 1726853405.35151: checking for any_errors_fatal 18714 1726853405.35153: done checking for any_errors_fatal 
18714 1726853405.35154: checking for max_fail_percentage 18714 1726853405.35155: done checking for max_fail_percentage 18714 1726853405.35169: checking to see if all hosts have failed and the running result is not ok 18714 1726853405.35170: done checking to see if all hosts have failed 18714 1726853405.35173: getting the remaining hosts for this loop 18714 1726853405.35174: done getting the remaining hosts for this loop 18714 1726853405.35177: getting the next task for host managed_node1 18714 1726853405.35182: done getting next task for host managed_node1 18714 1726853405.35184: ^ task is: TASK: Include the task 'el_repo_setup.yml' 18714 1726853405.35186: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853405.35188: getting variables 18714 1726853405.35189: in VariableManager get_vars() 18714 1726853405.35197: Calling all_inventory to load vars for managed_node1 18714 1726853405.35198: Calling groups_inventory to load vars for managed_node1 18714 1726853405.35201: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853405.35206: Calling all_plugins_play to load vars for managed_node1 18714 1726853405.35208: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853405.35211: Calling groups_plugins_play to load vars for managed_node1 18714 1726853405.35365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853405.35560: done with get_vars() 18714 1726853405.35567: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Friday 20 September 2024 
13:30:05 -0400 (0:00:01.727) 0:00:01.739 ****** 18714 1726853405.35645: entering _queue_task() for managed_node1/include_tasks 18714 1726853405.35647: Creating lock for include_tasks 18714 1726853405.36105: worker is 1 (out of 1 available) 18714 1726853405.36115: exiting _queue_task() for managed_node1/include_tasks 18714 1726853405.36125: done queuing things up, now waiting for results queue to drain 18714 1726853405.36126: waiting for pending results... 18714 1726853405.36239: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 18714 1726853405.36348: in run() - task 02083763-bbaf-e784-4f7d-000000000006 18714 1726853405.36380: variable 'ansible_search_path' from source: unknown 18714 1726853405.36420: calling self._execute() 18714 1726853405.36505: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853405.36515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853405.36529: variable 'omit' from source: magic vars 18714 1726853405.36637: _execute() done 18714 1726853405.36645: dumping result to json 18714 1726853405.36653: done dumping result, returning 18714 1726853405.36664: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-e784-4f7d-000000000006] 18714 1726853405.36686: sending task result for task 02083763-bbaf-e784-4f7d-000000000006 18714 1726853405.36856: done sending task result for task 02083763-bbaf-e784-4f7d-000000000006 18714 1726853405.36859: WORKER PROCESS EXITING 18714 1726853405.36930: no more pending results, returning what we have 18714 1726853405.36934: in VariableManager get_vars() 18714 1726853405.36965: Calling all_inventory to load vars for managed_node1 18714 1726853405.36968: Calling groups_inventory to load vars for managed_node1 18714 1726853405.36973: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853405.36987: Calling all_plugins_play to load vars for managed_node1 
18714 1726853405.36990: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853405.36993: Calling groups_plugins_play to load vars for managed_node1 18714 1726853405.37340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853405.37565: done with get_vars() 18714 1726853405.37574: variable 'ansible_search_path' from source: unknown 18714 1726853405.37588: we have included files to process 18714 1726853405.37589: generating all_blocks data 18714 1726853405.37590: done generating all_blocks data 18714 1726853405.37591: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18714 1726853405.37592: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18714 1726853405.37595: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18714 1726853405.38279: in VariableManager get_vars() 18714 1726853405.38295: done with get_vars() 18714 1726853405.38316: done processing included file 18714 1726853405.38319: iterating over new_blocks loaded from include file 18714 1726853405.38320: in VariableManager get_vars() 18714 1726853405.38329: done with get_vars() 18714 1726853405.38331: filtering new block on tags 18714 1726853405.38344: done filtering new block on tags 18714 1726853405.38347: in VariableManager get_vars() 18714 1726853405.38356: done with get_vars() 18714 1726853405.38358: filtering new block on tags 18714 1726853405.38375: done filtering new block on tags 18714 1726853405.38378: in VariableManager get_vars() 18714 1726853405.38388: done with get_vars() 18714 1726853405.38389: filtering new block on tags 18714 1726853405.38401: done filtering new block on tags 18714 1726853405.38403: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 18714 1726853405.38408: extending task lists for all hosts with included blocks 18714 1726853405.38470: done extending task lists 18714 1726853405.38474: done processing included files 18714 1726853405.38475: results queue empty 18714 1726853405.38475: checking for any_errors_fatal 18714 1726853405.38477: done checking for any_errors_fatal 18714 1726853405.38477: checking for max_fail_percentage 18714 1726853405.38478: done checking for max_fail_percentage 18714 1726853405.38479: checking to see if all hosts have failed and the running result is not ok 18714 1726853405.38480: done checking to see if all hosts have failed 18714 1726853405.38481: getting the remaining hosts for this loop 18714 1726853405.38482: done getting the remaining hosts for this loop 18714 1726853405.38484: getting the next task for host managed_node1 18714 1726853405.38488: done getting next task for host managed_node1 18714 1726853405.38490: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 18714 1726853405.38493: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853405.38495: getting variables 18714 1726853405.38496: in VariableManager get_vars() 18714 1726853405.38504: Calling all_inventory to load vars for managed_node1 18714 1726853405.38506: Calling groups_inventory to load vars for managed_node1 18714 1726853405.38508: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853405.38513: Calling all_plugins_play to load vars for managed_node1 18714 1726853405.38515: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853405.38517: Calling groups_plugins_play to load vars for managed_node1 18714 1726853405.38686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853405.38883: done with get_vars() 18714 1726853405.38891: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:30:05 -0400 (0:00:00.033) 0:00:01.772 ****** 18714 1726853405.38951: entering _queue_task() for managed_node1/setup 18714 1726853405.39221: worker is 1 (out of 1 available) 18714 1726853405.39234: exiting _queue_task() for managed_node1/setup 18714 1726853405.39244: done queuing things up, now waiting for results queue to drain 18714 1726853405.39245: waiting for pending results... 
18714 1726853405.39527: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 18714 1726853405.39542: in run() - task 02083763-bbaf-e784-4f7d-00000000008d 18714 1726853405.39561: variable 'ansible_search_path' from source: unknown 18714 1726853405.39577: variable 'ansible_search_path' from source: unknown 18714 1726853405.39623: calling self._execute() 18714 1726853405.39732: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853405.39736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853405.39738: variable 'omit' from source: magic vars 18714 1726853405.40199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853405.42339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853405.42478: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853405.42482: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853405.42506: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853405.42542: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853405.42624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853405.42639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853405.42669: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853405.42735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853405.42738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853405.42921: variable 'ansible_facts' from source: unknown 18714 1726853405.43077: variable 'network_test_required_facts' from source: task vars 18714 1726853405.43080: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 18714 1726853405.43082: variable 'omit' from source: magic vars 18714 1726853405.43084: variable 'omit' from source: magic vars 18714 1726853405.43086: variable 'omit' from source: magic vars 18714 1726853405.43093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853405.43127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853405.43148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853405.43172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853405.43189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853405.43220: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853405.43229: variable 'ansible_host' from source: host vars for 
'managed_node1' 18714 1726853405.43237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853405.43337: Set connection var ansible_shell_executable to /bin/sh 18714 1726853405.43349: Set connection var ansible_timeout to 10 18714 1726853405.43358: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853405.43368: Set connection var ansible_connection to ssh 18714 1726853405.43380: Set connection var ansible_shell_type to sh 18714 1726853405.43392: Set connection var ansible_pipelining to False 18714 1726853405.43420: variable 'ansible_shell_executable' from source: unknown 18714 1726853405.43430: variable 'ansible_connection' from source: unknown 18714 1726853405.43436: variable 'ansible_module_compression' from source: unknown 18714 1726853405.43442: variable 'ansible_shell_type' from source: unknown 18714 1726853405.43448: variable 'ansible_shell_executable' from source: unknown 18714 1726853405.43454: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853405.43461: variable 'ansible_pipelining' from source: unknown 18714 1726853405.43467: variable 'ansible_timeout' from source: unknown 18714 1726853405.43477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853405.43622: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853405.43663: variable 'omit' from source: magic vars 18714 1726853405.43667: starting attempt loop 18714 1726853405.43669: running the handler 18714 1726853405.43680: _low_level_execute_command(): starting 18714 1726853405.43683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853405.44761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 
1726853405.44780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853405.44873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853405.44890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853405.44918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853405.44934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853405.44959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853405.45043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18714 1726853405.46997: stdout chunk (state=3): >>>/root <<< 18714 1726853405.47090: stdout chunk (state=3): >>><<< 18714 1726853405.47100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853405.47110: stderr chunk (state=3): >>><<< 18714 1726853405.47186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18714 1726853405.47197: _low_level_execute_command(): starting 18714 1726853405.47200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026 `" && echo ansible-tmp-1726853405.4714053-18792-183197885081026="` echo /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026 `" ) && sleep 0' 18714 1726853405.47952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853405.47968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853405.47985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853405.48005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853405.48048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 <<< 18714 1726853405.48084: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853405.48166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853405.48187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853405.48212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853405.48236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853405.48324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18714 1726853405.51132: stdout chunk (state=3): >>>ansible-tmp-1726853405.4714053-18792-183197885081026=/root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026 <<< 18714 1726853405.51292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853405.51380: stderr chunk (state=3): >>><<< 18714 1726853405.51383: stdout chunk (state=3): >>><<< 18714 1726853405.51599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853405.4714053-18792-183197885081026=/root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18714 1726853405.51603: variable 'ansible_module_compression' from source: unknown 18714 1726853405.51605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853405.51607: variable 'ansible_facts' from source: unknown 18714 1726853405.52059: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/AnsiballZ_setup.py 18714 1726853405.52414: Sending initial data 18714 1726853405.52422: Sent initial data (154 bytes) 18714 1726853405.53055: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853405.53117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853405.53134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853405.53193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853405.53239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18714 1726853405.55555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853405.55603: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853405.55696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpws55l7fk /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/AnsiballZ_setup.py <<< 18714 1726853405.55699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/AnsiballZ_setup.py" <<< 18714 1726853405.55746: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpws55l7fk" to remote "/root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/AnsiballZ_setup.py" <<< 18714 1726853405.57676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853405.57777: stderr chunk (state=3): >>><<< 18714 1726853405.57781: stdout chunk (state=3): >>><<< 18714 1726853405.57783: done transferring module to remote 18714 1726853405.57785: _low_level_execute_command(): starting 18714 1726853405.57787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/ /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/AnsiballZ_setup.py && sleep 0' 18714 1726853405.58408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853405.58422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853405.58465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853405.58486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853405.58502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853405.58589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853405.58633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853405.58675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18714 1726853405.61375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853405.61380: stdout chunk (state=3): >>><<< 18714 1726853405.61382: stderr chunk (state=3): >>><<< 18714 1726853405.61567: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18714 1726853405.61574: _low_level_execute_command(): starting 18714 1726853405.61578: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/AnsiballZ_setup.py && sleep 0' 18714 1726853405.62879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853405.62897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853405.62914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 
1726853405.63002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18714 1726853405.66127: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18714 1726853405.66158: stdout chunk (state=3): >>>import _imp # builtin <<< 18714 1726853405.66197: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 18714 1726853405.66396: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 18714 1726853405.66419: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 18714 1726853405.66593: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18714 1726853405.66629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 18714 1726853405.66643: stdout chunk (state=3): >>> import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ee04d0><<< 18714 1726853405.66678: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337eafb30><<< 18714 1726853405.66689: stdout chunk (state=3): >>> <<< 18714 1726853405.67004: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 18714 1726853405.67007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 18714 1726853405.67010: stdout chunk (state=3): >>> import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1337ee2a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 18714 1726853405.67089: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18714 1726853405.67133: stdout chunk (state=3): >>>import 'genericpath' # <<< 18714 1726853405.67150: stdout chunk (state=3): >>> import 'posixpath' # <<< 18714 1726853405.67195: stdout chunk (state=3): >>> import 'os' # <<< 18714 1726853405.67230: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 18714 1726853405.67259: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages<<< 18714 1726853405.67288: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 18714 1726853405.67400: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 18714 1726853405.67420: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337c91130><<< 18714 1726853405.67496: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 18714 1726853405.67519: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 18714 1726853405.67693: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337c91fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 18714 
1726853405.67781: stdout chunk (state=3): >>> Type "help", "copyright", "credits" or "license" for more information. <<< 18714 1726853405.68278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 18714 1726853405.68310: stdout chunk (state=3): >>> <<< 18714 1726853405.68396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853405.68407: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18714 1726853405.68460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 18714 1726853405.68490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18714 1726853405.68526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 18714 1726853405.68548: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccfe60> <<< 18714 1726853405.68582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18714 1726853405.68690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 18714 1726853405.68789: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccff20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from 
'/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18714 1726853405.68806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853405.68836: stdout chunk (state=3): >>>import 'itertools' # <<< 18714 1726853405.68882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 18714 1726853405.68898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d07890> <<< 18714 1726853405.68932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 18714 1726853405.68962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 18714 1726853405.69193: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d07f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ce7b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ce5250> <<< 18714 1726853405.69282: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccd010> <<< 18714 1726853405.69487: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d27800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d26450><<< 18714 1726853405.69505: stdout chunk (state=3): >>> <<< 18714 1726853405.69805: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ce6120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d24cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d5cd10> <<< 18714 1726853405.69817: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5cbc0> <<< 18714 1726853405.69864: stdout chunk (state=3): 
>>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853405.69989: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d5cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 18714 1726853405.70017: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5d370> <<< 18714 1726853405.70040: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 18714 1726853405.70088: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 18714 1726853405.70107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 18714 1726853405.70130: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5e5a0> <<< 18714 1726853405.70159: stdout chunk (state=3): >>>import 'importlib.util' # <<< 18714 1726853405.70170: stdout chunk (state=3): >>>import 'runpy' # <<< 18714 1726853405.70196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18714 1726853405.70252: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18714 1726853405.70443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d747a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d75e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d76d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d77320> <<< 18714 1726853405.70535: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d76270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 18714 1726853405.70643: stdout chunk (state=3): >>># extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d77da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d774d0> <<< 18714 1726853405.70959: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a67bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a906b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a90410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a906e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853405.71052: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a91010> <<< 18714 1726853405.71394: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a919d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a908c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a65d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a92d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a90e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5e750> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18714 1726853405.71431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853405.71452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18714 1726853405.71487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18714 1726853405.71511: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337abf080> <<< 18714 1726853405.71570: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18714 1726853405.71599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 18714 1726853405.71692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18714 1726853405.71999: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337adf440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b40260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b429c0> <<< 18714 1726853405.72087: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b40380> <<< 18714 1726853405.72116: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b0d250> <<< 18714 1726853405.72154: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337945340> <<< 18714 1726853405.72205: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ade240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a93c50> <<< 18714 1726853405.72499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1337ade840> <<< 18714 1726853405.72625: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_08_10c64/ansible_setup_payload.zip' <<< 18714 1726853405.72645: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.72780: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 18714 1726853405.72799: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18714 1726853405.72835: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18714 1726853405.72915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18714 1726853405.72954: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379aef90> <<< 18714 1726853405.72957: stdout chunk (state=3): >>>import '_typing' # <<< 18714 1726853405.73392: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133798de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133798d070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 18714 1726853405.75479: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.77035: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379ace30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13379de900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379de690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379ddfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 18714 1726853405.77064: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379de3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ee29c0> <<< 18714 1726853405.77696: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13379df680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13379df800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches 
/usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379dfd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133732dac0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f133732f6e0> <<< 18714 1726853405.77727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133732ff50> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13373311c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337333cb0> # extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337b401d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337331f70> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 18714 1726853405.77752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18714 1726853405.77865: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 18714 1726853405.77904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133733bc20> import '_tokenize' # <<< 18714 1726853405.78053: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133733a6f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133733a450> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18714 1726853405.78080: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133733a9c0> <<< 18714 
1726853405.78107: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337332480> <<< 18714 1726853405.78136: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f133737fe90> <<< 18714 1726853405.78176: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133737fef0> <<< 18714 1726853405.78234: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18714 1726853405.78290: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337381ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337381880> <<< 18714 1726853405.78315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches 
/usr/lib64/python3.12/uuid.py <<< 18714 1726853405.78332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18714 1726853405.78759: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337383fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337382120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337387620> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13373840e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13373881a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337388380> <<< 18714 1726853405.78821: stdout chunk 
(state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337388860> <<< 18714 1726853405.78824: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13373801a0> <<< 18714 1726853405.78852: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 18714 1726853405.78887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 18714 1726853405.78896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 18714 1726853405.78945: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853405.78948: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f133738bfb0> <<< 18714 1726853405.79094: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853405.79133: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337215280> import 'socket' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f133738a750> <<< 18714 1726853405.79165: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f133738bb00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133738a3c0> # zipimport: zlib available <<< 18714 1726853405.79456: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 18714 1726853405.79467: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 18714 1726853405.79469: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.79531: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 18714 1726853405.79534: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.79583: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.79815: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.80263: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.80812: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 18714 1726853405.80840: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 18714 1726853405.80957: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853405.81023: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337219280> <<< 18714 1726853405.81027: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133721a060> <<< 18714 1726853405.81379: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133738b290> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18714 1726853405.81383: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.81436: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 18714 1726853405.81464: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133721a150> # zipimport: zlib available <<< 18714 1726853405.81926: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.82470: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.82539: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 18714 1726853405.82568: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 18714 1726853405.82600: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18714 1726853405.82615: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.82696: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.82798: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 18714 1726853405.82814: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 18714 1726853405.82919: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 18714 1726853405.83128: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.83356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18714 1726853405.83421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18714 1726853405.83449: stdout chunk (state=3): >>>import '_ast' # <<< 18714 1726853405.83504: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133721b320> <<< 18714 1726853405.83526: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.83587: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.83688: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 18714 1726853405.83797: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 18714 1726853405.84002: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available <<< 18714 1726853405.84024: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18714 1726853405.84053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853405.84154: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337225ee0> <<< 18714 1726853405.84235: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372232f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 18714 1726853405.84707: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18714 1726853405.84710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18714 1726853405.84767: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133730e7e0> <<< 18714 1726853405.84826: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a0a4b0> <<< 18714 1726853405.84957: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337225f10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337218770> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 18714 1726853405.84994: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.85036: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18714 1726853405.85300: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 18714 1726853405.85303: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.85340: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.85386: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.85444: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.85506: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.85633: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # <<< 18714 1726853405.85653: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.85741: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 
1726853405.86026: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 18714 1726853405.86311: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.86651: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.86735: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 18714 1726853405.86759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853405.86836: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 18714 1726853405.86855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 18714 1726853405.86967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b5d60> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py<<< 18714 1726853405.87241: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches 
/usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336e97da0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336eac140> <<< 18714 1726853405.87438: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133729ec90> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b6870> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b4470> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b40b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 18714 1726853405.87484: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336eaf1d0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1336eaea80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853405.87527: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336eaec60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336eadeb0> <<< 18714 1726853405.87540: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 18714 1726853405.87742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 18714 1726853405.87766: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336eaf2f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336efddf0> <<< 18714 1726853405.87813: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336eafdd0> <<< 18714 1726853405.87825: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b5520> import 
'ansible.module_utils.facts.timeout' # <<< 18714 1726853405.87966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 18714 1726853405.87988: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.88014: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 18714 1726853405.88072: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.88138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 18714 1726853405.88192: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 18714 1726853405.88226: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 18714 1726853405.88278: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.88314: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.88354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 18714 1726853405.88404: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.88416: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.88806: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 18714 1726853405.89618: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.90309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 18714 1726853405.90333: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.90484: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.90526: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.90576: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 18714 1726853405.90604: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.90648: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.90706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 18714 1726853405.90801: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.90902: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 18714 1726853405.90923: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.91012: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 18714 1726853405.91079: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.91114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 18714 1726853405.91135: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.91268: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.91411: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 18714 1726853405.91465: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336effe60> <<< 18714 1726853405.91485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches 
/usr/lib64/python3.12/configparser.py <<< 18714 1726853405.91511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18714 1726853405.91814: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336efe930> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 18714 1726853405.91913: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.92002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 18714 1726853405.92042: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.92110: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.92195: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 18714 1726853405.92290: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 18714 1726853405.92402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18714 1726853405.92482: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853405.92500: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336f3a030> <<< 18714 1726853405.92684: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336f283b0> import 'ansible.module_utils.facts.system.python' # <<< 18714 1726853405.92711: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 18714 1726853405.92715: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.92784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 18714 1726853405.92866: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.92945: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.93063: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.93239: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 18714 1726853405.93353: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 18714 1726853405.93357: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.93403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 18714 1726853405.93556: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336f4dc70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336f2a870> import 'ansible.module_utils.facts.system.user' # <<< 18714 1726853405.93569: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.93602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' 
# <<< 18714 1726853405.93613: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.93769: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.93921: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 18714 1726853405.93930: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.94088: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.94328: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.94331: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.94410: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.94565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 18714 1726853405.94583: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.94759: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.94807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 18714 1726853405.94898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.95456: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.95965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 18714 1726853405.96008: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96119: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 18714 1726853405.96219: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96291: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18714 1726853405.96415: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96566: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 18714 1726853405.96753: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 18714 1726853405.96774: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96801: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 18714 1726853405.96864: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.96952: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.97048: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.97247: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.97495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 18714 1726853405.97513: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.97560: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 18714 1726853405.97564: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.97614: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 18714 1726853405.97880: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 18714 1726853405.97933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.97946: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 18714 1726853405.98003: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.98073: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 18714 1726853405.98435: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.98588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 18714 1726853405.98654: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.98722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 18714 1726853405.98759: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.98802: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 18714 1726853405.98839: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.98875: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 18714 1726853405.98908: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.98959: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.98990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 18714 1726853405.99219: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 18714 1726853405.99223: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.99267: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 18714 1726853405.99306: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.99323: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853405.99368: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.99427: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.99485: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.99558: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 18714 1726853405.99586: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 18714 1726853405.99641: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.99684: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 18714 1726853405.99751: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853405.99891: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.00096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 18714 1726853406.00130: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.00183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 18714 1726853406.00208: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.00295: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # <<< 18714 1726853406.00335: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 
1726853406.00376: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.00453: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 18714 1726853406.00468: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.00558: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.00642: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18714 1726853406.00709: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.01648: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 18714 1726853406.01680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 18714 1726853406.01711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 18714 1726853406.01747: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336d4b680> <<< 18714 1726853406.01765: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336d48920> <<< 18714 1726853406.01788: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336d49910> <<< 18714 1726853406.02217: stdout chunk (state=3): >>> 
{"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "06", "epoch": "1726853406", "epoch_int": "1726853406", "date": "2024-09-20", "time": "13:30:06", "iso8601_micro": "2024-09-20T17:30:06.015622Z", "iso8601": "2024-09-20T17:30:06Z", "iso8601_basic": "20240920T133006015622", "iso8601_basic_short": "20240920T133006", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853406.02927: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # 
cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 18714 1726853406.02998: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random <<< 18714 1726853406.03109: stdout chunk (state=3): >>># cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] 
removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 18714 1726853406.03185: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] 
removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl 
# cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing 
ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd <<< 18714 1726853406.03206: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # 
destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 18714 1726853406.03630: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 18714 1726853406.03688: stdout chunk (state=3): >>># destroy ipaddress <<< 18714 1726853406.03724: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 18714 1726853406.03802: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 18714 1726853406.03886: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 18714 1726853406.03957: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 18714 1726853406.04042: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 18714 1726853406.04129: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 18714 1726853406.04194: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # 
cleanup[3] wiping encodings.cp437 <<< 18714 1726853406.04240: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 18714 1726853406.04358: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18714 1726853406.04404: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 18714 1726853406.04433: stdout chunk 
(state=3): >>># destroy _collections <<< 18714 1726853406.04575: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 18714 1726853406.04602: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18714 1726853406.04650: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 18714 1726853406.04729: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 18714 1726853406.04819: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18714 1726853406.05109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853406.05234: stderr chunk (state=3): >>><<< 18714 1726853406.05237: stdout chunk (state=3): >>><<< 18714 1726853406.05518: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ee04d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337eafb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ee2a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337c91130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337c91fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccfe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccff20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d07890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d07f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ce7b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ce5250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccd010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d27800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d26450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ce6120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d24cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d5cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d5cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ccadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5e5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d747a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d75e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d76d20> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d77320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d76270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337d77da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d774d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a67bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a906b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a90410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a906e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a91010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337a919d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a908c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a65d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a92d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a90e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337d5e750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337abf080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337adf440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b40260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b429c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b40380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337b0d250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337945340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ade240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a93c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1337ade840> # zipimport: found 103 names in '/tmp/ansible_setup_payload_08_10c64/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f13379aef90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133798de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133798d070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379ace30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13379de900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379de690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379ddfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379de3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337ee29c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13379df680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13379df800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13379dfd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133732dac0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f133732f6e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f133732ff50> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13373311c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337333cb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337b401d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337331f70> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f133733bc20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133733a6f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133733a450> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133733a9c0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337332480> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f133737fe90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133737fef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337381ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337381880> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337383fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337382120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337387620> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13373840e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13373881a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337388380> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337388860> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13373801a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f133738bfb0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337215280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133738a750> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f133738bb00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133738a3c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337219280> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133721a060> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133738b290> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133721a150> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133721b320> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1337225ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372232f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133730e7e0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337a0a4b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337225f10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1337218770> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b5d60> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336e97da0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336eac140> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f133729ec90> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b6870> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b4470> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b40b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336eaf1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336eaea80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336eaec60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336eadeb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336eaf2f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336efddf0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336eafdd0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13372b5520> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336effe60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336efe930> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336f3a030> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336f283b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336f4dc70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336f2a870> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1336d4b680> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336d48920> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1336d49910> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "06", "epoch": "1726853406", "epoch_int": "1726853406", "date": "2024-09-20", "time": "13:30:06", "iso8601_micro": "2024-09-20T17:30:06.015622Z", "iso8601": "2024-09-20T17:30:06Z", "iso8601_basic": "20240920T133006015622", "iso8601_basic_short": "20240920T133006", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing 
encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] 
removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
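The OpenSSH debug lines above show connection reuse in action: `auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'` means Ansible found a live ControlMaster socket and tunneled this task through it instead of opening a fresh TCP/SSH handshake. As a rough sketch (the helper names and the existence check are illustrative, not Ansible's own code), the liveness probe OpenSSH supports for such a socket is `ssh -O check -S <control_path> <host>`:

```python
import os

def mux_check_cmd(control_path: str, host: str) -> list[str]:
    """Build the OpenSSH invocation that asks an existing master
    connection whether it is alive (`ssh -O check`)."""
    return ["ssh", "-O", "check", "-S", control_path, host]

def have_master(control_path: str) -> bool:
    """Cheap pre-check: the control socket file must at least exist
    before 'Trying existing master' can succeed. The authoritative
    test is still the -O check round-trip above."""
    return os.path.exists(control_path)

# Example matching the socket path and host seen in the log:
cmd = mux_check_cmd("/root/.ansible/cp/a2da574bb2", "10.31.45.153")
```

Running `cmd` via `subprocess.run` on the controller would return exit status 0 if the master is alive, which is why the log ends with `Received exit status from master 0` before the shared connection is closed.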
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 18714 1726853406.06692: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853406.06695: _low_level_execute_command(): starting 18714 1726853406.06698: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853405.4714053-18792-183197885081026/ > /dev/null 2>&1 && sleep 0' 18714 1726853406.06977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853406.07013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853406.07016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853406.07019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853406.07021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853406.07024: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853406.07029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.07042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853406.07059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853406.07062: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853406.07064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853406.07122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853406.07125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853406.07127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853406.07129: stderr chunk (state=3): >>>debug2: match found <<< 18714 1726853406.07130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.07168: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853406.07186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853406.07195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853406.07259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853406.09177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853406.09190: stdout chunk (state=3): >>><<< 18714 1726853406.09202: stderr chunk (state=3): >>><<< 18714 1726853406.09221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853406.09232: handler run complete 18714 1726853406.09293: variable 'ansible_facts' from source: unknown 18714 1726853406.09350: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.09470: variable 'ansible_facts' from source: unknown 18714 1726853406.09545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.09616: attempt loop complete, returning result 18714 1726853406.09623: _execute() done 18714 1726853406.09631: dumping result to json 18714 1726853406.09646: done dumping result, returning 18714 1726853406.09658: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-e784-4f7d-00000000008d] 18714 1726853406.09664: sending task result for task 02083763-bbaf-e784-4f7d-00000000008d 18714 1726853406.10001: done sending task result for task 02083763-bbaf-e784-4f7d-00000000008d 18714 1726853406.10004: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853406.10196: no more pending results, returning what we have 18714 1726853406.10199: results queue empty 18714 1726853406.10200: checking for any_errors_fatal 18714 1726853406.10202: done checking for any_errors_fatal 18714 1726853406.10202: checking for max_fail_percentage 18714 1726853406.10204: done checking for max_fail_percentage 18714 1726853406.10205: checking to see if all hosts have failed and the running result is not ok 18714 1726853406.10205: done checking to see if all hosts have failed 18714 1726853406.10206: getting the remaining hosts for this loop 18714 1726853406.10207: done getting the remaining hosts for this loop 18714 1726853406.10211: getting the next task for host managed_node1 18714 1726853406.10220: done getting next task for host managed_node1 18714 1726853406.10337: ^ task is: TASK: Check if system is ostree 18714 1726853406.10341: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853406.10344: getting variables 18714 1726853406.10346: in VariableManager get_vars() 18714 1726853406.10377: Calling all_inventory to load vars for managed_node1 18714 1726853406.10380: Calling groups_inventory to load vars for managed_node1 18714 1726853406.10384: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853406.10393: Calling all_plugins_play to load vars for managed_node1 18714 1726853406.10396: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853406.10399: Calling groups_plugins_play to load vars for managed_node1 18714 1726853406.10684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.10804: done with get_vars() 18714 1726853406.10818: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:30:06 -0400 (0:00:00.719) 0:00:02.492 ****** 18714 1726853406.10900: entering _queue_task() for managed_node1/stat 18714 1726853406.11120: worker is 1 (out of 1 available) 18714 1726853406.11133: exiting _queue_task() for managed_node1/stat 18714 1726853406.11144: done queuing things up, now waiting for results queue to drain 18714 1726853406.11145: waiting for pending results... 
18714 1726853406.11290: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 18714 1726853406.11354: in run() - task 02083763-bbaf-e784-4f7d-00000000008f 18714 1726853406.11363: variable 'ansible_search_path' from source: unknown 18714 1726853406.11368: variable 'ansible_search_path' from source: unknown 18714 1726853406.11399: calling self._execute() 18714 1726853406.11456: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.11460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.11467: variable 'omit' from source: magic vars 18714 1726853406.11804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853406.11978: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853406.12009: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853406.12055: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853406.12080: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853406.12152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853406.12165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853406.12185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853406.12202: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853406.12338: Evaluated conditional (not __network_is_ostree is defined): True 18714 1726853406.12341: variable 'omit' from source: magic vars 18714 1726853406.12359: variable 'omit' from source: magic vars 18714 1726853406.12407: variable 'omit' from source: magic vars 18714 1726853406.12452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853406.12456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853406.12473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853406.12675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853406.12682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853406.12685: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853406.12687: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.12689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.12691: Set connection var ansible_shell_executable to /bin/sh 18714 1726853406.12693: Set connection var ansible_timeout to 10 18714 1726853406.12695: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853406.12697: Set connection var ansible_connection to ssh 18714 1726853406.12699: Set connection var ansible_shell_type to sh 18714 1726853406.12701: Set connection var ansible_pipelining to False 18714 1726853406.12702: variable 'ansible_shell_executable' from source: unknown 18714 1726853406.12704: variable 'ansible_connection' from 
source: unknown 18714 1726853406.12706: variable 'ansible_module_compression' from source: unknown 18714 1726853406.12712: variable 'ansible_shell_type' from source: unknown 18714 1726853406.12718: variable 'ansible_shell_executable' from source: unknown 18714 1726853406.12724: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.12730: variable 'ansible_pipelining' from source: unknown 18714 1726853406.12735: variable 'ansible_timeout' from source: unknown 18714 1726853406.12742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.12884: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853406.12902: variable 'omit' from source: magic vars 18714 1726853406.12910: starting attempt loop 18714 1726853406.12916: running the handler 18714 1726853406.12936: _low_level_execute_command(): starting 18714 1726853406.12955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853406.13689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.13755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853406.13778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853406.13801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853406.13862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853406.16027: stdout chunk (state=3): >>>/root <<< 18714 1726853406.16188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853406.16197: stderr chunk (state=3): >>><<< 18714 1726853406.16200: stdout chunk (state=3): >>><<< 18714 1726853406.16220: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853406.16243: _low_level_execute_command(): starting 18714 1726853406.16247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513 `" && echo ansible-tmp-1726853406.1622024-18835-142887961312513="` echo /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513 `" ) && sleep 0' 18714 1726853406.16912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853406.16934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853406.16986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853406.19617: stdout chunk (state=3): 
>>>ansible-tmp-1726853406.1622024-18835-142887961312513=/root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513 <<< 18714 1726853406.19756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853406.19785: stderr chunk (state=3): >>><<< 18714 1726853406.19789: stdout chunk (state=3): >>><<< 18714 1726853406.19811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853406.1622024-18835-142887961312513=/root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853406.19855: variable 'ansible_module_compression' from source: unknown 18714 1726853406.19900: ANSIBALLZ: Using lock for stat 18714 1726853406.19905: ANSIBALLZ: Acquiring lock 18714 1726853406.19912: ANSIBALLZ: Lock acquired: 139791971424000 18714 1726853406.19914: 
ANSIBALLZ: Creating module 18714 1726853406.27461: ANSIBALLZ: Writing module into payload 18714 1726853406.27523: ANSIBALLZ: Writing module 18714 1726853406.27540: ANSIBALLZ: Renaming module 18714 1726853406.27544: ANSIBALLZ: Done creating module 18714 1726853406.27565: variable 'ansible_facts' from source: unknown 18714 1726853406.27610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/AnsiballZ_stat.py 18714 1726853406.27713: Sending initial data 18714 1726853406.27716: Sent initial data (153 bytes) 18714 1726853406.28180: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853406.28183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853406.28186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.28188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853406.28190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853406.28192: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.28244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853406.28247: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 18714 1726853406.28250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853406.28307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853406.30610: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853406.30653: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853406.30697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmphle83y1t /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/AnsiballZ_stat.py <<< 18714 1726853406.30700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/AnsiballZ_stat.py" <<< 18714 1726853406.30738: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmphle83y1t" to remote "/root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/AnsiballZ_stat.py" <<< 18714 1726853406.30744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/AnsiballZ_stat.py" <<< 18714 1726853406.31276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853406.31319: stderr chunk (state=3): >>><<< 18714 1726853406.31323: stdout chunk (state=3): >>><<< 18714 1726853406.31356: done transferring module to remote 18714 1726853406.31368: _low_level_execute_command(): starting 18714 1726853406.31377: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/ /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/AnsiballZ_stat.py && sleep 0' 18714 1726853406.31827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853406.31831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853406.31833: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.31835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853406.31837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853406.31839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.31884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853406.31905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853406.31945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853406.34523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853406.34549: stderr chunk (state=3): >>><<< 18714 1726853406.34556: stdout chunk (state=3): >>><<< 18714 1726853406.34573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853406.34581: _low_level_execute_command(): starting 18714 1726853406.34587: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/AnsiballZ_stat.py && sleep 0' 18714 1726853406.35046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853406.35050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.35052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853406.35054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 
1726853406.35110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853406.35116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853406.35119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853406.35167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853406.38416: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18714 1726853406.38448: stdout chunk (state=3): >>>import _imp # builtin <<< 18714 1726853406.38486: stdout chunk (state=3): >>>import '_thread' # <<< 18714 1726853406.38490: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 18714 1726853406.38696: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 18714 1726853406.38704: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 18714 1726853406.38884: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18714 1726853406.38915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48502184d0> <<< 18714 1726853406.38927: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48501e7b30> <<< 18714 1726853406.39208: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485021aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # <<< 18714 1726853406.39243: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 18714 1726853406.39298: stdout chunk (state=3): >>>import 'os' # <<< 18714 1726853406.39303: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 18714 1726853406.39340: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 18714 1726853406.39362: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 18714 1726853406.39395: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 18714 1726853406.39406: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 18714 1726853406.39429: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ffe9130> <<< 18714 1726853406.39518: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853406.39540: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ffe9fa0> <<< 18714 1726853406.39569: stdout chunk (state=3): >>>import 'site' 
# <<< 18714 1726853406.39619: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 18714 1726853406.39985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18714 1726853406.40010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18714 1726853406.40039: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 18714 1726853406.40061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853406.40088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18714 1726853406.40148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 18714 1726853406.40180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18714 1726853406.40215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 18714 1726853406.40232: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850027e60> <<< 18714 1726853406.40267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18714 1726853406.40378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850027f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18714 1726853406.40408: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18714 1726853406.40466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853406.40501: stdout chunk (state=3): >>>import 'itertools' # <<< 18714 1726853406.40536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 18714 1726853406.40540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 18714 1726853406.40544: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485005f890> <<< 18714 1726853406.40574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 18714 1726853406.40592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 18714 1726853406.40607: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485005ff20> <<< 18714 1726853406.40630: stdout chunk (state=3): >>>import '_collections' # <<< 18714 1726853406.40696: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485003fb30> <<< 18714 1726853406.40775: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485003d250> <<< 18714 1726853406.40897: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850025010> <<< 18714 1726853406.40941: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 18714 1726853406.40968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18714 1726853406.40990: stdout chunk (state=3): >>>import '_sre' # <<< 18714 1726853406.41026: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 18714 1726853406.41179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485007f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485007e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485003e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485007ccb0> <<< 18714 1726853406.41236: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18714 1726853406.41250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 18714 1726853406.41274: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b4860> <<< 18714 1726853406.41277: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850024290> <<< 18714 1726853406.41305: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 18714 1726853406.41352: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.41364: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.41383: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500b4d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b4bc0> <<< 18714 1726853406.41427: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.41446: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.41452: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500b4fb0> <<< 18714 1726853406.41454: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850022db0> <<< 18714 1726853406.41500: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 18714 1726853406.41504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853406.41680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 
'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b56a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b5370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b65a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18714 1726853406.41710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18714 1726853406.41760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 18714 1726853406.41776: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500cc7a0> <<< 18714 1726853406.41806: stdout chunk (state=3): >>>import 'errno' # <<< 18714 1726853406.41833: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.41844: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500cde80> <<< 18714 1726853406.41894: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 18714 1726853406.41896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 18714 1726853406.41933: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 18714 1726853406.41945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 18714 1726853406.41957: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500ced20> <<< 18714 1726853406.42027: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500cf320> <<< 18714 1726853406.42037: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500ce270> <<< 18714 1726853406.42143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 18714 1726853406.42161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.42192: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500cfda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500cf4d0> <<< 18714 1726853406.42214: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b6510> <<< 18714 1726853406.42244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 18714 1726853406.42405: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe5fbf0> <<< 18714 1726853406.42409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 18714 1726853406.42421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 18714 1726853406.42441: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.42590: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe886b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe88410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe886e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18714 1726853406.42742: stdout chunk (state=3): >>># extension module '_hashlib' 
loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.42789: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.42803: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe89010> <<< 18714 1726853406.42956: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.42991: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe899d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe888c0> <<< 18714 1726853406.43019: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe5dd90> <<< 18714 1726853406.43046: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18714 1726853406.43086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18714 1726853406.43112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18714 1726853406.43137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 18714 1726853406.43155: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe8ad20> <<< 18714 1726853406.43193: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe88e60> <<< 18714 
1726853406.43218: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b6750> <<< 18714 1726853406.43260: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18714 1726853406.43344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853406.43380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18714 1726853406.43423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18714 1726853406.43466: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484feb7080> <<< 18714 1726853406.43683: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fed7440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18714 1726853406.43727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18714 1726853406.43796: stdout chunk (state=3): >>>import 'ntpath' # <<< 18714 1726853406.43844: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 18714 1726853406.43863: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff38260> <<< 18714 1726853406.43881: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18714 1726853406.43913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18714 1726853406.43948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18714 1726853406.44189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff3a9c0> <<< 18714 1726853406.44224: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff38380> <<< 18714 1726853406.44281: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff05250> <<< 18714 1726853406.44318: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 18714 1726853406.44330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd3d340> <<< 18714 1726853406.44360: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fed6240> <<< 18714 1726853406.44369: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe8bc50> <<< 18714 1726853406.44532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 
18714 1726853406.44567: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f484fed6840> <<< 18714 1726853406.44776: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_h37iyzxd/ansible_stat_payload.zip' <<< 18714 1726853406.44796: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.45186: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18714 1726853406.45210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18714 1726853406.45265: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 18714 1726853406.45284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd92fc0> <<< 18714 1726853406.45301: stdout chunk (state=3): >>>import '_typing' # <<< 18714 1726853406.45727: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd71eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd71070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 18714 1726853406.47838: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.49592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py <<< 18714 1726853406.49619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd90e90> <<< 18714 1726853406.49642: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 18714 1726853406.49664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853406.49703: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18714 1726853406.49887: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fdbe900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbe690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbdfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 18714 1726853406.49913: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbe3f0> <<< 18714 1726853406.50095: stdout chunk (state=3): >>>import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f484fd93c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fdbf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fdbf8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18714 1726853406.50142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 18714 1726853406.50172: stdout chunk (state=3): >>>import '_locale' # <<< 18714 1726853406.50229: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbfe30> <<< 18714 1726853406.50252: stdout chunk (state=3): >>>import 'pwd' # <<< 18714 1726853406.50395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f711c40> <<< 18714 1726853406.50412: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.50433: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.50586: stdout chunk (state=3): >>>import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f713860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7141d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f715370> <<< 18714 1726853406.50707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 18714 1726853406.50711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18714 1726853406.50791: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f717e30> <<< 18714 1726853406.50835: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.50845: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fd730b0> <<< 18714 1726853406.50874: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7160f0> <<< 18714 1726853406.50901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py <<< 18714 1726853406.50946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 18714 1726853406.50981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 18714 1726853406.51002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 18714 1726853406.51022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18714 1726853406.51094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 18714 1726853406.51141: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 18714 1726853406.51145: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71fda0> <<< 18714 1726853406.51165: stdout chunk (state=3): >>>import '_tokenize' # <<< 18714 1726853406.51509: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71e870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71e5d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71eb40> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f716600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f7679e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f768170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18714 1726853406.51754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f769bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f769970> <<< 18714 1726853406.51768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18714 1726853406.51826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18714 1726853406.51858: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.51973: stdout chunk (state=3): >>># extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f76c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f76a2a0> <<< 18714 1726853406.51981: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18714 1726853406.52011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 18714 1726853406.52111: stdout chunk (state=3): >>>import '_string' # <<< 18714 1726853406.52131: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f76f860> <<< 18714 1726853406.52297: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f76c230> <<< 18714 1726853406.52390: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.52394: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.52619: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f770380> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f484f7709e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f770ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.52622: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f7fc0e0> <<< 18714 1726853406.52896: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f7fd250> <<< 18714 1726853406.52900: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f772870> <<< 18714 1726853406.52933: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' 
# extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 18714 1726853406.52999: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f773c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7724b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 18714 1726853406.53024: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.53173: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.53294: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.53392: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18714 1726853406.53544: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.53715: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.54693: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.55492: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 18714 1726853406.55496: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 18714 1726853406.55520: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 18714 1726853406.55532: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 18714 1726853406.55579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18714 1726853406.55627: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f601490> <<< 18714 1726853406.55785: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f6022a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7fd6d0> <<< 18714 1726853406.55934: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 18714 1726853406.55937: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.55962: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18714 1726853406.56387: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f6023f0> <<< 18714 1726853406.56407: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.57155: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.57851: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.58010: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.58061: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18714 1726853406.58142: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.58156: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 18714 1726853406.58181: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18714 1726853406.58203: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.58311: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.58435: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18714 1726853406.58558: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.58582: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 18714 1726853406.58623: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 18714 1726853406.58646: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.59015: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.59373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18714 1726853406.59458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18714 1726853406.59557: stdout chunk (state=3): >>>import '_ast' # <<< 18714 1726853406.59586: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f603530> <<< 18714 1726853406.59600: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.59694: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.59798: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 18714 1726853406.59828: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18714 1726853406.59845: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 18714 1726853406.59873: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 18714 1726853406.59994: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 18714 1726853406.60177: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.60204: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18714 1726853406.60224: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.60320: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18714 1726853406.60585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f60e000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f608e90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 18714 1726853406.60667: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.60765: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.60804: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.60870: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 18714 1726853406.60895: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18714 
1726853406.60910: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18714 1726853406.60942: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18714 1726853406.60976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18714 1726853406.61061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18714 1726853406.61091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 18714 1726853406.61116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18714 1726853406.61209: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe0ea20> <<< 18714 1726853406.61268: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe026f0> <<< 18714 1726853406.61376: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f60e210> <<< 18714 1726853406.61433: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f602f90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 18714 1726853406.61455: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.61498: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18714 1726853406.61658: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 18714 1726853406.61662: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18714 1726853406.61855: stdout chunk (state=3): >>># zipimport: zlib available <<< 18714 1726853406.62387: stdout chunk (state=3): >>># zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 18714 1726853406.62830: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 18714 1726853406.62931: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] 
removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site <<< 18714 1726853406.63013: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum <<< 18714 1726853406.63043: stdout chunk (state=3): >>># cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 <<< 18714 1726853406.63112: stdout chunk (state=3): >>># cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib <<< 18714 1726853406.63123: stdout chunk (state=3): >>># cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref<<< 18714 
1726853406.63226: stdout chunk (state=3): >>> # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] 
removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 18714 1726853406.63570: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 18714 1726853406.63627: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 18714 1726853406.63755: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 18714 1726853406.63789: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 18714 1726853406.63823: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 18714 1726853406.63848: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 18714 1726853406.63941: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 18714 1726853406.63945: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 18714 1726853406.63988: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18714 1726853406.64170: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 18714 1726853406.64186: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 18714 1726853406.64294: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 18714 1726853406.64328: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 18714 1726853406.64499: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18714 1726853406.64814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853406.64874: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853406.64884: stdout chunk (state=3): >>><<< 18714 1726853406.64894: stderr chunk (state=3): >>><<< 18714 1726853406.65174: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48502184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48501e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485021aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ffe9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ffe9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850027e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850027f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485005f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485005ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485003fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485003d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850025010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485007f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485007e450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485003e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f485007ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b4860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850024290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500b4d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b4bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500b4fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4850022db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b56a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b5370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b65a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500cc7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500cde80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500ced20> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500cf320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500ce270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48500cfda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500cf4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b6510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe5fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe886b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe88410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe886e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe89010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fe899d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe888c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe5dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe8ad20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe88e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48500b6750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484feb7080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fed7440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff38260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff3a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff38380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484ff05250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd3d340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fed6240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe8bc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f484fed6840> # zipimport: found 30 names in '/tmp/ansible_stat_payload_h37iyzxd/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f484fd92fc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd71eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd71070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd90e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fdbe900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbe690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbdfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbe3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fd93c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fdbf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fdbf8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fdbfe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f711c40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f713860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f484f7141d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f715370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f717e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484fd730b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7160f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71fda0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71e870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71e5d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f71eb40> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f716600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f7679e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f768170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f769bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f769970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f76c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f76a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f76f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f76c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f770380> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f7709e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f770ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f7fc0e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f7fd250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f772870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f773c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7724b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f601490> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f6022a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f7fd6d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f6023f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f603530> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f484f60e000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f608e90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe0ea20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484fe026f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f60e210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f484f602f90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess 18714 1726853406.66291: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853406.66294: _low_level_execute_command(): starting 18714 1726853406.66296: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853406.1622024-18835-142887961312513/ > /dev/null 2>&1 && sleep 0' 18714 1726853406.66981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853406.66985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853406.66988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.67006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 18714 1726853406.67009:
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853406.67152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853406.67232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853406.67236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853406.67285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853406.70353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853406.70357: stdout chunk (state=3): >>><<< 18714 1726853406.70359: stderr chunk (state=3): >>><<< 18714 1726853406.70361: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853406.70364: handler run complete 18714 1726853406.70366: attempt loop complete, returning result 18714 1726853406.70368: _execute() done 18714 1726853406.70370: dumping result to json 18714 1726853406.70376: done dumping result, returning 18714 1726853406.70378: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [02083763-bbaf-e784-4f7d-00000000008f] 18714 1726853406.70380: sending task result for task 02083763-bbaf-e784-4f7d-00000000008f 18714 1726853406.70779: done sending task result for task 02083763-bbaf-e784-4f7d-00000000008f 18714 1726853406.70783: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18714 1726853406.70844: no more pending results, returning what we have 18714 1726853406.70847: results queue empty 18714 1726853406.70848: checking for any_errors_fatal 18714 1726853406.70857: done checking for any_errors_fatal 18714 1726853406.70858: checking for max_fail_percentage 18714 1726853406.70860: done checking for max_fail_percentage 18714 1726853406.70860: checking to see if all hosts have failed and the running result is not ok 18714 1726853406.70861: done checking to see if all hosts have failed 18714 1726853406.70862: getting the remaining hosts for this loop 18714 1726853406.70863: done getting the remaining hosts for this loop 18714 1726853406.70867: getting the next task for host managed_node1 18714 1726853406.70875: done getting next task for host managed_node1 18714 1726853406.70878: ^ task is: TASK: Set flag to indicate system is ostree 18714 1726853406.70880: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853406.70884: getting variables 18714 1726853406.70886: in VariableManager get_vars() 18714 1726853406.70916: Calling all_inventory to load vars for managed_node1 18714 1726853406.70919: Calling groups_inventory to load vars for managed_node1 18714 1726853406.70922: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853406.70932: Calling all_plugins_play to load vars for managed_node1 18714 1726853406.70936: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853406.70939: Calling groups_plugins_play to load vars for managed_node1 18714 1726853406.71507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.72278: done with get_vars() 18714 1726853406.72290: done getting variables 18714 1726853406.72392: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:30:06 -0400 (0:00:00.615) 0:00:03.107 ****** 18714 1726853406.72422: entering _queue_task() for managed_node1/set_fact 18714 1726853406.72424: 
Creating lock for set_fact 18714 1726853406.72945: worker is 1 (out of 1 available) 18714 1726853406.72959: exiting _queue_task() for managed_node1/set_fact 18714 1726853406.73374: done queuing things up, now waiting for results queue to drain 18714 1726853406.73376: waiting for pending results... 18714 1726853406.73786: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 18714 1726853406.73792: in run() - task 02083763-bbaf-e784-4f7d-000000000090 18714 1726853406.73794: variable 'ansible_search_path' from source: unknown 18714 1726853406.73797: variable 'ansible_search_path' from source: unknown 18714 1726853406.73799: calling self._execute() 18714 1726853406.73993: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.74004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.74077: variable 'omit' from source: magic vars 18714 1726853406.75177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853406.75603: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853406.75655: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853406.75876: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853406.75879: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853406.75935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853406.76028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 
18714 1726853406.76138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853406.76173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853406.76545: Evaluated conditional (not __network_is_ostree is defined): True 18714 1726853406.76551: variable 'omit' from source: magic vars 18714 1726853406.76554: variable 'omit' from source: magic vars 18714 1726853406.76742: variable '__ostree_booted_stat' from source: set_fact 18714 1726853406.76814: variable 'omit' from source: magic vars 18714 1726853406.76899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853406.77006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853406.77029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853406.77051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853406.77091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853406.77276: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853406.77280: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.77283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.77425: Set connection var ansible_shell_executable to /bin/sh 18714 1726853406.77438: Set connection var ansible_timeout to 10 18714 1726853406.77447: Set connection var ansible_module_compression to ZIP_DEFLATED 
18714 1726853406.77460: Set connection var ansible_connection to ssh 18714 1726853406.77628: Set connection var ansible_shell_type to sh 18714 1726853406.77632: Set connection var ansible_pipelining to False 18714 1726853406.77634: variable 'ansible_shell_executable' from source: unknown 18714 1726853406.77636: variable 'ansible_connection' from source: unknown 18714 1726853406.77639: variable 'ansible_module_compression' from source: unknown 18714 1726853406.77641: variable 'ansible_shell_type' from source: unknown 18714 1726853406.77643: variable 'ansible_shell_executable' from source: unknown 18714 1726853406.77644: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.77646: variable 'ansible_pipelining' from source: unknown 18714 1726853406.77648: variable 'ansible_timeout' from source: unknown 18714 1726853406.77653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.77794: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853406.77861: variable 'omit' from source: magic vars 18714 1726853406.78064: starting attempt loop 18714 1726853406.78067: running the handler 18714 1726853406.78069: handler run complete 18714 1726853406.78072: attempt loop complete, returning result 18714 1726853406.78074: _execute() done 18714 1726853406.78076: dumping result to json 18714 1726853406.78077: done dumping result, returning 18714 1726853406.78079: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [02083763-bbaf-e784-4f7d-000000000090] 18714 1726853406.78080: sending task result for task 02083763-bbaf-e784-4f7d-000000000090 18714 1726853406.78139: done sending task result for task 
02083763-bbaf-e784-4f7d-000000000090 18714 1726853406.78141: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 18714 1726853406.78218: no more pending results, returning what we have 18714 1726853406.78221: results queue empty 18714 1726853406.78222: checking for any_errors_fatal 18714 1726853406.78228: done checking for any_errors_fatal 18714 1726853406.78229: checking for max_fail_percentage 18714 1726853406.78230: done checking for max_fail_percentage 18714 1726853406.78231: checking to see if all hosts have failed and the running result is not ok 18714 1726853406.78232: done checking to see if all hosts have failed 18714 1726853406.78232: getting the remaining hosts for this loop 18714 1726853406.78234: done getting the remaining hosts for this loop 18714 1726853406.78237: getting the next task for host managed_node1 18714 1726853406.78246: done getting next task for host managed_node1 18714 1726853406.78252: ^ task is: TASK: Fix CentOS6 Base repo 18714 1726853406.78255: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853406.78259: getting variables 18714 1726853406.78261: in VariableManager get_vars() 18714 1726853406.78297: Calling all_inventory to load vars for managed_node1 18714 1726853406.78300: Calling groups_inventory to load vars for managed_node1 18714 1726853406.78304: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853406.78316: Calling all_plugins_play to load vars for managed_node1 18714 1726853406.78319: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853406.78329: Calling groups_plugins_play to load vars for managed_node1 18714 1726853406.78754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.79323: done with get_vars() 18714 1726853406.79334: done getting variables 18714 1726853406.79662: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:30:06 -0400 (0:00:00.072) 0:00:03.180 ****** 18714 1726853406.79693: entering _queue_task() for managed_node1/copy 18714 1726853406.80376: worker is 1 (out of 1 available) 18714 1726853406.80387: exiting _queue_task() for managed_node1/copy 18714 1726853406.80398: done queuing things up, now waiting for results queue to drain 18714 1726853406.80399: waiting for pending results... 
18714 1726853406.80784: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 18714 1726853406.81057: in run() - task 02083763-bbaf-e784-4f7d-000000000092 18714 1726853406.81070: variable 'ansible_search_path' from source: unknown 18714 1726853406.81076: variable 'ansible_search_path' from source: unknown 18714 1726853406.81377: calling self._execute() 18714 1726853406.81381: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.81384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.81395: variable 'omit' from source: magic vars 18714 1726853406.82427: variable 'ansible_distribution' from source: facts 18714 1726853406.82448: Evaluated conditional (ansible_distribution == 'CentOS'): True 18714 1726853406.82687: variable 'ansible_distribution_major_version' from source: facts 18714 1726853406.82702: Evaluated conditional (ansible_distribution_major_version == '6'): False 18714 1726853406.82712: when evaluation is False, skipping this task 18714 1726853406.82718: _execute() done 18714 1726853406.82725: dumping result to json 18714 1726853406.82732: done dumping result, returning 18714 1726853406.82742: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [02083763-bbaf-e784-4f7d-000000000092] 18714 1726853406.82749: sending task result for task 02083763-bbaf-e784-4f7d-000000000092 18714 1726853406.82957: done sending task result for task 02083763-bbaf-e784-4f7d-000000000092 18714 1726853406.82960: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18714 1726853406.83038: no more pending results, returning what we have 18714 1726853406.83041: results queue empty 18714 1726853406.83042: checking for any_errors_fatal 18714 1726853406.83047: done checking for any_errors_fatal 18714 1726853406.83048: checking for 
max_fail_percentage 18714 1726853406.83052: done checking for max_fail_percentage 18714 1726853406.83052: checking to see if all hosts have failed and the running result is not ok 18714 1726853406.83053: done checking to see if all hosts have failed 18714 1726853406.83054: getting the remaining hosts for this loop 18714 1726853406.83055: done getting the remaining hosts for this loop 18714 1726853406.83058: getting the next task for host managed_node1 18714 1726853406.83065: done getting next task for host managed_node1 18714 1726853406.83067: ^ task is: TASK: Include the task 'enable_epel.yml' 18714 1726853406.83070: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853406.83076: getting variables 18714 1726853406.83077: in VariableManager get_vars() 18714 1726853406.83102: Calling all_inventory to load vars for managed_node1 18714 1726853406.83104: Calling groups_inventory to load vars for managed_node1 18714 1726853406.83107: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853406.83116: Calling all_plugins_play to load vars for managed_node1 18714 1726853406.83118: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853406.83121: Calling groups_plugins_play to load vars for managed_node1 18714 1726853406.83491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.83821: done with get_vars() 18714 1726853406.83831: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:30:06 -0400 (0:00:00.043) 0:00:03.224 ****** 18714 1726853406.84058: entering _queue_task() for managed_node1/include_tasks 18714 1726853406.84424: worker is 1 (out of 1 available) 18714 1726853406.84437: exiting _queue_task() for managed_node1/include_tasks 18714 1726853406.84462: done queuing things up, now waiting for results queue to drain 18714 1726853406.84464: waiting for pending results... 
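The "Fix CentOS6 Base repo" skip recorded above comes from a two-part `when` guard on a `copy` task: the trace shows `ansible_distribution == 'CentOS'` evaluating True and `ansible_distribution_major_version == '6'` evaluating False, which yields the skip. A minimal sketch of that shape follows; the `dest` and `content` values are placeholders, since the real arguments at el_repo_setup.yml:26 are not visible in this log:

```yaml
# Hedged reconstruction -- dest and content are assumptions, not the
# actual values from el_repo_setup.yml:26 (only the two conditions
# and the 'copy' action plugin appear in the trace).
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed path
    content: "{{ __fixed_base_repo }}"        # assumed variable
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

With a list-form `when`, all conditions must hold; the first False short-circuits the task, which is why the result carries `"false_condition": "ansible_distribution_major_version == '6'"`.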
18714 1726853406.84678: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 18714 1726853406.84781: in run() - task 02083763-bbaf-e784-4f7d-000000000093 18714 1726853406.84866: variable 'ansible_search_path' from source: unknown 18714 1726853406.84875: variable 'ansible_search_path' from source: unknown 18714 1726853406.84879: calling self._execute() 18714 1726853406.84927: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.84937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.84950: variable 'omit' from source: magic vars 18714 1726853406.85436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853406.89523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853406.89584: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853406.89738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853406.89756: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853406.89790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853406.90077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853406.90087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853406.90119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853406.90243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853406.90657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853406.90885: variable '__network_is_ostree' from source: set_fact 18714 1726853406.90913: Evaluated conditional (not __network_is_ostree | d(false)): True 18714 1726853406.90924: _execute() done 18714 1726853406.90936: dumping result to json 18714 1726853406.90944: done dumping result, returning 18714 1726853406.91082: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-e784-4f7d-000000000093] 18714 1726853406.91086: sending task result for task 02083763-bbaf-e784-4f7d-000000000093 18714 1726853406.91433: done sending task result for task 02083763-bbaf-e784-4f7d-000000000093 18714 1726853406.91439: WORKER PROCESS EXITING 18714 1726853406.91469: no more pending results, returning what we have 18714 1726853406.91476: in VariableManager get_vars() 18714 1726853406.91510: Calling all_inventory to load vars for managed_node1 18714 1726853406.91512: Calling groups_inventory to load vars for managed_node1 18714 1726853406.91515: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853406.91526: Calling all_plugins_play to load vars for managed_node1 18714 1726853406.91529: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853406.91531: Calling groups_plugins_play to load vars for managed_node1 18714 1726853406.92124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 18714 1726853406.92524: done with get_vars() 18714 1726853406.92534: variable 'ansible_search_path' from source: unknown 18714 1726853406.92536: variable 'ansible_search_path' from source: unknown 18714 1726853406.92580: we have included files to process 18714 1726853406.92581: generating all_blocks data 18714 1726853406.92583: done generating all_blocks data 18714 1726853406.92588: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18714 1726853406.92590: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18714 1726853406.92592: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18714 1726853406.93934: done processing included file 18714 1726853406.93937: iterating over new_blocks loaded from include file 18714 1726853406.93938: in VariableManager get_vars() 18714 1726853406.93952: done with get_vars() 18714 1726853406.93953: filtering new block on tags 18714 1726853406.93974: done filtering new block on tags 18714 1726853406.93977: in VariableManager get_vars() 18714 1726853406.93987: done with get_vars() 18714 1726853406.93988: filtering new block on tags 18714 1726853406.93998: done filtering new block on tags 18714 1726853406.93999: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 18714 1726853406.94004: extending task lists for all hosts with included blocks 18714 1726853406.94139: done extending task lists 18714 1726853406.94141: done processing included files 18714 1726853406.94142: results queue empty 18714 1726853406.94142: checking for any_errors_fatal 18714 1726853406.94145: done checking for any_errors_fatal 18714 1726853406.94146: checking for max_fail_percentage 18714 1726853406.94147: done 
checking for max_fail_percentage 18714 1726853406.94148: checking to see if all hosts have failed and the running result is not ok 18714 1726853406.94148: done checking to see if all hosts have failed 18714 1726853406.94149: getting the remaining hosts for this loop 18714 1726853406.94150: done getting the remaining hosts for this loop 18714 1726853406.94152: getting the next task for host managed_node1 18714 1726853406.94156: done getting next task for host managed_node1 18714 1726853406.94158: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 18714 1726853406.94161: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853406.94163: getting variables 18714 1726853406.94164: in VariableManager get_vars() 18714 1726853406.94176: Calling all_inventory to load vars for managed_node1 18714 1726853406.94178: Calling groups_inventory to load vars for managed_node1 18714 1726853406.94181: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853406.94186: Calling all_plugins_play to load vars for managed_node1 18714 1726853406.94200: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853406.94203: Calling groups_plugins_play to load vars for managed_node1 18714 1726853406.94368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.94577: done with get_vars() 18714 1726853406.94586: done getting variables 18714 1726853406.94667: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18714 1726853406.94898: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:30:06 -0400 (0:00:00.108) 0:00:03.333 ****** 18714 1726853406.94964: entering _queue_task() for managed_node1/command 18714 1726853406.94966: Creating lock for command 18714 1726853406.95347: worker is 1 (out of 1 available) 18714 1726853406.95364: exiting _queue_task() for managed_node1/command 18714 1726853406.95379: done queuing things up, now waiting for results queue to drain 18714 1726853406.95380: waiting for pending results... 
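The include above is gated on the `__network_is_ostree` fact: because `not __network_is_ostree | d(false)` evaluated True, ansible-core loads enable_epel.yml, generates its blocks, filters them on tags, and extends the task list for managed_node1. A minimal sketch of that guard, assuming a relative include path as the task path suggests:

```yaml
# Hedged sketch of the include at el_repo_setup.yml:51; the d(false)
# default means an undefined __network_is_ostree counts as "not ostree".
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Because `include_tasks` is dynamic, the included tasks only enter the host's task list after this conditional passes, which matches the "extending task lists for all hosts with included blocks" step in the trace.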
18714 1726853406.95817: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 18714 1726853406.95830: in run() - task 02083763-bbaf-e784-4f7d-0000000000ad 18714 1726853406.95851: variable 'ansible_search_path' from source: unknown 18714 1726853406.95859: variable 'ansible_search_path' from source: unknown 18714 1726853406.95903: calling self._execute() 18714 1726853406.95999: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.96010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.96032: variable 'omit' from source: magic vars 18714 1726853406.96467: variable 'ansible_distribution' from source: facts 18714 1726853406.96479: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18714 1726853406.96616: variable 'ansible_distribution_major_version' from source: facts 18714 1726853406.96628: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18714 1726853406.96676: when evaluation is False, skipping this task 18714 1726853406.96684: _execute() done 18714 1726853406.96693: dumping result to json 18714 1726853406.96696: done dumping result, returning 18714 1726853406.96703: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [02083763-bbaf-e784-4f7d-0000000000ad] 18714 1726853406.96708: sending task result for task 02083763-bbaf-e784-4f7d-0000000000ad skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18714 1726853406.96982: no more pending results, returning what we have 18714 1726853406.96986: results queue empty 18714 1726853406.96987: checking for any_errors_fatal 18714 1726853406.96989: done checking for any_errors_fatal 18714 1726853406.96989: checking for max_fail_percentage 18714 1726853406.96991: done checking for max_fail_percentage 18714 1726853406.96992: checking to see if all hosts have failed 
and the running result is not ok 18714 1726853406.96992: done checking to see if all hosts have failed 18714 1726853406.96993: getting the remaining hosts for this loop 18714 1726853406.96996: done getting the remaining hosts for this loop 18714 1726853406.96999: getting the next task for host managed_node1 18714 1726853406.97007: done getting next task for host managed_node1 18714 1726853406.97009: ^ task is: TASK: Install yum-utils package 18714 1726853406.97024: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853406.97030: getting variables 18714 1726853406.97032: in VariableManager get_vars() 18714 1726853406.97064: Calling all_inventory to load vars for managed_node1 18714 1726853406.97067: Calling groups_inventory to load vars for managed_node1 18714 1726853406.97074: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853406.97092: Calling all_plugins_play to load vars for managed_node1 18714 1726853406.97100: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853406.97104: Calling groups_plugins_play to load vars for managed_node1 18714 1726853406.97614: done sending task result for task 02083763-bbaf-e784-4f7d-0000000000ad 18714 1726853406.97617: WORKER PROCESS EXITING 18714 1726853406.97839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853406.98258: done with get_vars() 18714 1726853406.98381: done getting variables 18714 1726853406.98485: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:30:06 -0400 (0:00:00.036) 0:00:03.369 ****** 18714 1726853406.98631: entering _queue_task() for managed_node1/package 18714 1726853406.98633: Creating lock for package 18714 1726853406.99179: worker is 1 (out of 1 available) 18714 1726853406.99190: exiting _queue_task() for managed_node1/package 18714 1726853406.99200: done queuing things up, now waiting for results queue to drain 18714 1726853406.99201: waiting for pending results... 
18714 1726853406.99353: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 18714 1726853406.99472: in run() - task 02083763-bbaf-e784-4f7d-0000000000ae 18714 1726853406.99495: variable 'ansible_search_path' from source: unknown 18714 1726853406.99502: variable 'ansible_search_path' from source: unknown 18714 1726853406.99546: calling self._execute() 18714 1726853406.99625: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853406.99635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853406.99679: variable 'omit' from source: magic vars 18714 1726853407.00061: variable 'ansible_distribution' from source: facts 18714 1726853407.00083: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18714 1726853407.00225: variable 'ansible_distribution_major_version' from source: facts 18714 1726853407.00240: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18714 1726853407.00248: when evaluation is False, skipping this task 18714 1726853407.00294: _execute() done 18714 1726853407.00297: dumping result to json 18714 1726853407.00299: done dumping result, returning 18714 1726853407.00301: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [02083763-bbaf-e784-4f7d-0000000000ae] 18714 1726853407.00303: sending task result for task 02083763-bbaf-e784-4f7d-0000000000ae skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18714 1726853407.00502: no more pending results, returning what we have 18714 1726853407.00565: results queue empty 18714 1726853407.00566: checking for any_errors_fatal 18714 1726853407.00575: done checking for any_errors_fatal 18714 1726853407.00576: checking for max_fail_percentage 18714 1726853407.00578: done checking for max_fail_percentage 18714 1726853407.00578: checking to see if 
all hosts have failed and the running result is not ok 18714 1726853407.00579: done checking to see if all hosts have failed 18714 1726853407.00580: getting the remaining hosts for this loop 18714 1726853407.00581: done getting the remaining hosts for this loop 18714 1726853407.00584: getting the next task for host managed_node1 18714 1726853407.00592: done getting next task for host managed_node1 18714 1726853407.00594: ^ task is: TASK: Enable EPEL 7 18714 1726853407.00599: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853407.00602: getting variables 18714 1726853407.00604: in VariableManager get_vars() 18714 1726853407.00641: Calling all_inventory to load vars for managed_node1 18714 1726853407.00644: Calling groups_inventory to load vars for managed_node1 18714 1726853407.00647: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.00662: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.00665: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.00668: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.00778: done sending task result for task 02083763-bbaf-e784-4f7d-0000000000ae 18714 1726853407.00781: WORKER PROCESS EXITING 18714 1726853407.01133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.01360: done with get_vars() 18714 1726853407.01369: done getting variables 18714 1726853407.01435: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:30:07 -0400 (0:00:00.028) 0:00:03.398 ****** 18714 1726853407.01467: entering _queue_task() for managed_node1/command 18714 1726853407.01951: worker is 1 (out of 1 available) 18714 1726853407.01959: exiting _queue_task() for managed_node1/command 18714 1726853407.01969: done queuing things up, now waiting for results queue to drain 18714 1726853407.01970: waiting for pending results... 
18714 1726853407.01996: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 18714 1726853407.02145: in run() - task 02083763-bbaf-e784-4f7d-0000000000af 18714 1726853407.02166: variable 'ansible_search_path' from source: unknown 18714 1726853407.02177: variable 'ansible_search_path' from source: unknown 18714 1726853407.02266: calling self._execute() 18714 1726853407.02365: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.02376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.02389: variable 'omit' from source: magic vars 18714 1726853407.02798: variable 'ansible_distribution' from source: facts 18714 1726853407.02815: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18714 1726853407.02974: variable 'ansible_distribution_major_version' from source: facts 18714 1726853407.02985: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18714 1726853407.02993: when evaluation is False, skipping this task 18714 1726853407.03000: _execute() done 18714 1726853407.03007: dumping result to json 18714 1726853407.03015: done dumping result, returning 18714 1726853407.03025: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [02083763-bbaf-e784-4f7d-0000000000af] 18714 1726853407.03035: sending task result for task 02083763-bbaf-e784-4f7d-0000000000af skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18714 1726853407.03227: no more pending results, returning what we have 18714 1726853407.03231: results queue empty 18714 1726853407.03232: checking for any_errors_fatal 18714 1726853407.03240: done checking for any_errors_fatal 18714 1726853407.03241: checking for max_fail_percentage 18714 1726853407.03242: done checking for max_fail_percentage 18714 1726853407.03244: checking to see if all hosts have failed and 
the running result is not ok 18714 1726853407.03244: done checking to see if all hosts have failed 18714 1726853407.03245: getting the remaining hosts for this loop 18714 1726853407.03246: done getting the remaining hosts for this loop 18714 1726853407.03252: getting the next task for host managed_node1 18714 1726853407.03261: done getting next task for host managed_node1 18714 1726853407.03264: ^ task is: TASK: Enable EPEL 8 18714 1726853407.03269: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853407.03275: getting variables 18714 1726853407.03277: in VariableManager get_vars() 18714 1726853407.03539: Calling all_inventory to load vars for managed_node1 18714 1726853407.03542: Calling groups_inventory to load vars for managed_node1 18714 1726853407.03545: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.03554: done sending task result for task 02083763-bbaf-e784-4f7d-0000000000af 18714 1726853407.03557: WORKER PROCESS EXITING 18714 1726853407.03567: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.03570: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.03575: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.03796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.04003: done with get_vars() 18714 1726853407.04012: done getting variables 18714 1726853407.04083: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:30:07 -0400 (0:00:00.026) 0:00:03.424 ****** 18714 1726853407.04108: entering _queue_task() for managed_node1/command 18714 1726853407.04367: worker is 1 (out of 1 available) 18714 1726853407.04581: exiting _queue_task() for managed_node1/command 18714 1726853407.04590: done queuing things up, now waiting for results queue to drain 18714 1726853407.04591: waiting for pending results... 
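Each EPEL task in this stretch ("Create EPEL 10", "Install yum-utils package", "Enable EPEL 7", "Enable EPEL 8") is skipped by the same two-part guard: `ansible_distribution in ['RedHat', 'CentOS']` evaluates True, but `ansible_distribution_major_version in ['7', '8']` evaluates False on this CentOS 10 host (the task header "Create EPEL 10" shows the templated major version). A minimal sketch of the pattern; the command body is a placeholder, since the log records only the action plugin, not its arguments:

```yaml
# Hedged sketch -- the real command at enable_epel.yml:32 is not
# visible in the log; "{{ __enable_epel_cmd }}" is a placeholder.
- name: Enable EPEL 7
  command: "{{ __enable_epel_cmd }}"
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

The same guard explains every `"false_condition": "ansible_distribution_major_version in ['7', '8']"` result in this section; only "Enable EPEL 6" differs, using `ansible_distribution_major_version == '6'` instead.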
18714 1726853407.04629: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 18714 1726853407.04752: in run() - task 02083763-bbaf-e784-4f7d-0000000000b0 18714 1726853407.04776: variable 'ansible_search_path' from source: unknown 18714 1726853407.04784: variable 'ansible_search_path' from source: unknown 18714 1726853407.04827: calling self._execute() 18714 1726853407.04911: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.04929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.04943: variable 'omit' from source: magic vars 18714 1726853407.05348: variable 'ansible_distribution' from source: facts 18714 1726853407.05377: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18714 1726853407.05536: variable 'ansible_distribution_major_version' from source: facts 18714 1726853407.05547: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18714 1726853407.05583: when evaluation is False, skipping this task 18714 1726853407.05587: _execute() done 18714 1726853407.05590: dumping result to json 18714 1726853407.05592: done dumping result, returning 18714 1726853407.05594: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [02083763-bbaf-e784-4f7d-0000000000b0] 18714 1726853407.05629: sending task result for task 02083763-bbaf-e784-4f7d-0000000000b0 18714 1726853407.05768: done sending task result for task 02083763-bbaf-e784-4f7d-0000000000b0 18714 1726853407.05773: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18714 1726853407.05830: no more pending results, returning what we have 18714 1726853407.05833: results queue empty 18714 1726853407.05834: checking for any_errors_fatal 18714 1726853407.05839: done checking for any_errors_fatal 18714 1726853407.05840: checking for 
max_fail_percentage 18714 1726853407.05842: done checking for max_fail_percentage 18714 1726853407.05843: checking to see if all hosts have failed and the running result is not ok 18714 1726853407.05843: done checking to see if all hosts have failed 18714 1726853407.05844: getting the remaining hosts for this loop 18714 1726853407.05845: done getting the remaining hosts for this loop 18714 1726853407.05851: getting the next task for host managed_node1 18714 1726853407.05860: done getting next task for host managed_node1 18714 1726853407.05863: ^ task is: TASK: Enable EPEL 6 18714 1726853407.05867: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853407.05872: getting variables 18714 1726853407.05874: in VariableManager get_vars() 18714 1726853407.06084: Calling all_inventory to load vars for managed_node1 18714 1726853407.06087: Calling groups_inventory to load vars for managed_node1 18714 1726853407.06090: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.06100: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.06103: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.06105: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.06353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.06567: done with get_vars() 18714 1726853407.06583: done getting variables 18714 1726853407.06648: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:30:07 -0400 (0:00:00.025) 0:00:03.450 ****** 18714 1726853407.06681: entering _queue_task() for managed_node1/copy 18714 1726853407.07067: worker is 1 (out of 1 available) 18714 1726853407.07081: exiting _queue_task() for managed_node1/copy 18714 1726853407.07091: done queuing things up, now waiting for results queue to drain 18714 1726853407.07092: waiting for pending results... 
18714 1726853407.07325: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 18714 1726853407.07362: in run() - task 02083763-bbaf-e784-4f7d-0000000000b2 18714 1726853407.07389: variable 'ansible_search_path' from source: unknown 18714 1726853407.07397: variable 'ansible_search_path' from source: unknown 18714 1726853407.07438: calling self._execute() 18714 1726853407.07533: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.07537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.07576: variable 'omit' from source: magic vars 18714 1726853407.07952: variable 'ansible_distribution' from source: facts 18714 1726853407.07974: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18714 1726853407.08108: variable 'ansible_distribution_major_version' from source: facts 18714 1726853407.08142: Evaluated conditional (ansible_distribution_major_version == '6'): False 18714 1726853407.08145: when evaluation is False, skipping this task 18714 1726853407.08148: _execute() done 18714 1726853407.08153: dumping result to json 18714 1726853407.08155: done dumping result, returning 18714 1726853407.08185: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [02083763-bbaf-e784-4f7d-0000000000b2] 18714 1726853407.08188: sending task result for task 02083763-bbaf-e784-4f7d-0000000000b2 18714 1726853407.08328: done sending task result for task 02083763-bbaf-e784-4f7d-0000000000b2 18714 1726853407.08331: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18714 1726853407.08503: no more pending results, returning what we have 18714 1726853407.08508: results queue empty 18714 1726853407.08508: checking for any_errors_fatal 18714 1726853407.08514: done checking for any_errors_fatal 18714 1726853407.08515: checking for max_fail_percentage 
18714 1726853407.08517: done checking for max_fail_percentage 18714 1726853407.08518: checking to see if all hosts have failed and the running result is not ok 18714 1726853407.08519: done checking to see if all hosts have failed 18714 1726853407.08520: getting the remaining hosts for this loop 18714 1726853407.08521: done getting the remaining hosts for this loop 18714 1726853407.08525: getting the next task for host managed_node1 18714 1726853407.08535: done getting next task for host managed_node1 18714 1726853407.08538: ^ task is: TASK: Set network provider to 'nm' 18714 1726853407.08540: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853407.08544: getting variables 18714 1726853407.08546: in VariableManager get_vars() 18714 1726853407.08694: Calling all_inventory to load vars for managed_node1 18714 1726853407.08698: Calling groups_inventory to load vars for managed_node1 18714 1726853407.08701: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.08712: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.08715: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.08718: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.09198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.09397: done with get_vars() 18714 1726853407.09406: done getting variables 18714 1726853407.09479: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Friday 20 September 2024 13:30:07 -0400 (0:00:00.028) 0:00:03.478 ****** 18714 1726853407.09506: entering _queue_task() for managed_node1/set_fact 18714 1726853407.09889: worker is 1 (out of 1 available) 18714 1726853407.09899: exiting _queue_task() for managed_node1/set_fact 18714 1726853407.09912: done queuing things up, now waiting for results queue to drain 18714 1726853407.09913: waiting for pending results... 18714 1726853407.10154: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 18714 1726853407.10196: in run() - task 02083763-bbaf-e784-4f7d-000000000007 18714 1726853407.10201: variable 'ansible_search_path' from source: unknown 18714 1726853407.10305: calling self._execute() 18714 1726853407.10333: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.10345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.10374: variable 'omit' from source: magic vars 18714 1726853407.10483: variable 'omit' from source: magic vars 18714 1726853407.10518: variable 'omit' from source: magic vars 18714 1726853407.10555: variable 'omit' from source: magic vars 18714 1726853407.10604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853407.10647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853407.10737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853407.10740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853407.10743: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853407.10745: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853407.10753: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.10760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.10864: Set connection var ansible_shell_executable to /bin/sh 18714 1726853407.10876: Set connection var ansible_timeout to 10 18714 1726853407.10885: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853407.10902: Set connection var ansible_connection to ssh 18714 1726853407.10911: Set connection var ansible_shell_type to sh 18714 1726853407.10920: Set connection var ansible_pipelining to False 18714 1726853407.10944: variable 'ansible_shell_executable' from source: unknown 18714 1726853407.10957: variable 'ansible_connection' from source: unknown 18714 1726853407.11000: variable 'ansible_module_compression' from source: unknown 18714 1726853407.11006: variable 'ansible_shell_type' from source: unknown 18714 1726853407.11008: variable 'ansible_shell_executable' from source: unknown 18714 1726853407.11010: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.11012: variable 'ansible_pipelining' from source: unknown 18714 1726853407.11014: variable 'ansible_timeout' from source: unknown 18714 1726853407.11016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.11152: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853407.11175: variable 'omit' from source: magic vars 18714 1726853407.11216: starting 
attempt loop 18714 1726853407.11224: running the handler 18714 1726853407.11227: handler run complete 18714 1726853407.11229: attempt loop complete, returning result 18714 1726853407.11231: _execute() done 18714 1726853407.11233: dumping result to json 18714 1726853407.11240: done dumping result, returning 18714 1726853407.11253: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [02083763-bbaf-e784-4f7d-000000000007] 18714 1726853407.11278: sending task result for task 02083763-bbaf-e784-4f7d-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 18714 1726853407.11499: no more pending results, returning what we have 18714 1726853407.11502: results queue empty 18714 1726853407.11503: checking for any_errors_fatal 18714 1726853407.11511: done checking for any_errors_fatal 18714 1726853407.11512: checking for max_fail_percentage 18714 1726853407.11514: done checking for max_fail_percentage 18714 1726853407.11515: checking to see if all hosts have failed and the running result is not ok 18714 1726853407.11515: done checking to see if all hosts have failed 18714 1726853407.11516: getting the remaining hosts for this loop 18714 1726853407.11517: done getting the remaining hosts for this loop 18714 1726853407.11521: getting the next task for host managed_node1 18714 1726853407.11528: done getting next task for host managed_node1 18714 1726853407.11530: ^ task is: TASK: meta (flush_handlers) 18714 1726853407.11532: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853407.11537: getting variables 18714 1726853407.11538: in VariableManager get_vars() 18714 1726853407.11631: Calling all_inventory to load vars for managed_node1 18714 1726853407.11634: Calling groups_inventory to load vars for managed_node1 18714 1726853407.11637: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.11648: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.11717: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.11722: done sending task result for task 02083763-bbaf-e784-4f7d-000000000007 18714 1726853407.11725: WORKER PROCESS EXITING 18714 1726853407.11729: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.11980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.12182: done with get_vars() 18714 1726853407.12191: done getting variables 18714 1726853407.12258: in VariableManager get_vars() 18714 1726853407.12266: Calling all_inventory to load vars for managed_node1 18714 1726853407.12268: Calling groups_inventory to load vars for managed_node1 18714 1726853407.12279: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.12283: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.12285: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.12288: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.12456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.12657: done with get_vars() 18714 1726853407.12672: done queuing things up, now waiting for results queue to drain 18714 1726853407.12674: results queue empty 18714 1726853407.12675: checking for any_errors_fatal 18714 1726853407.12677: done checking for any_errors_fatal 18714 1726853407.12678: checking for max_fail_percentage 18714 
1726853407.12679: done checking for max_fail_percentage 18714 1726853407.12679: checking to see if all hosts have failed and the running result is not ok 18714 1726853407.12680: done checking to see if all hosts have failed 18714 1726853407.12681: getting the remaining hosts for this loop 18714 1726853407.12681: done getting the remaining hosts for this loop 18714 1726853407.12684: getting the next task for host managed_node1 18714 1726853407.12687: done getting next task for host managed_node1 18714 1726853407.12689: ^ task is: TASK: meta (flush_handlers) 18714 1726853407.12690: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853407.12697: getting variables 18714 1726853407.12698: in VariableManager get_vars() 18714 1726853407.12713: Calling all_inventory to load vars for managed_node1 18714 1726853407.12715: Calling groups_inventory to load vars for managed_node1 18714 1726853407.12717: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.12721: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.12724: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.12726: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.12867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.13066: done with get_vars() 18714 1726853407.13075: done getting variables 18714 1726853407.13118: in VariableManager get_vars() 18714 1726853407.13126: Calling all_inventory to load vars for managed_node1 18714 1726853407.13128: Calling groups_inventory to load vars for managed_node1 18714 1726853407.13130: Calling all_plugins_inventory to load vars for managed_node1 18714 
1726853407.13134: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.13137: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.13146: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.13288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.13500: done with get_vars() 18714 1726853407.13510: done queuing things up, now waiting for results queue to drain 18714 1726853407.13512: results queue empty 18714 1726853407.13513: checking for any_errors_fatal 18714 1726853407.13514: done checking for any_errors_fatal 18714 1726853407.13514: checking for max_fail_percentage 18714 1726853407.13515: done checking for max_fail_percentage 18714 1726853407.13516: checking to see if all hosts have failed and the running result is not ok 18714 1726853407.13517: done checking to see if all hosts have failed 18714 1726853407.13517: getting the remaining hosts for this loop 18714 1726853407.13518: done getting the remaining hosts for this loop 18714 1726853407.13520: getting the next task for host managed_node1 18714 1726853407.13523: done getting next task for host managed_node1 18714 1726853407.13524: ^ task is: None 18714 1726853407.13525: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853407.13526: done queuing things up, now waiting for results queue to drain 18714 1726853407.13527: results queue empty 18714 1726853407.13527: checking for any_errors_fatal 18714 1726853407.13528: done checking for any_errors_fatal 18714 1726853407.13528: checking for max_fail_percentage 18714 1726853407.13529: done checking for max_fail_percentage 18714 1726853407.13530: checking to see if all hosts have failed and the running result is not ok 18714 1726853407.13531: done checking to see if all hosts have failed 18714 1726853407.13532: getting the next task for host managed_node1 18714 1726853407.13534: done getting next task for host managed_node1 18714 1726853407.13535: ^ task is: None 18714 1726853407.13536: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853407.13596: in VariableManager get_vars() 18714 1726853407.13610: done with get_vars() 18714 1726853407.13616: in VariableManager get_vars() 18714 1726853407.13624: done with get_vars() 18714 1726853407.13628: variable 'omit' from source: magic vars 18714 1726853407.13662: in VariableManager get_vars() 18714 1726853407.13673: done with get_vars() 18714 1726853407.13699: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 18714 1726853407.13890: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853407.13923: getting the remaining hosts for this loop 18714 1726853407.13924: done getting the remaining hosts for this loop 18714 1726853407.13927: getting the next task for host managed_node1 18714 1726853407.13929: done getting next task for host managed_node1 18714 1726853407.13931: ^ task is: TASK: Gathering Facts 18714 1726853407.13932: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853407.13934: getting variables 18714 1726853407.13935: in VariableManager get_vars() 18714 1726853407.13942: Calling all_inventory to load vars for managed_node1 18714 1726853407.13944: Calling groups_inventory to load vars for managed_node1 18714 1726853407.13946: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853407.13953: Calling all_plugins_play to load vars for managed_node1 18714 1726853407.13966: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853407.13969: Calling groups_plugins_play to load vars for managed_node1 18714 1726853407.14102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853407.14278: done with get_vars() 18714 1726853407.14286: done getting variables 18714 1726853407.14322: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 13:30:07 -0400 (0:00:00.048) 0:00:03.526 ****** 18714 1726853407.14355: entering _queue_task() for managed_node1/gather_facts 18714 1726853407.14620: worker is 1 (out of 1 available) 18714 1726853407.14631: exiting _queue_task() for managed_node1/gather_facts 18714 1726853407.14641: done queuing things up, now waiting for results queue to drain 18714 1726853407.14642: waiting for pending results... 
18714 1726853407.14878: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853407.14977: in run() - task 02083763-bbaf-e784-4f7d-0000000000d8 18714 1726853407.15076: variable 'ansible_search_path' from source: unknown 18714 1726853407.15081: calling self._execute() 18714 1726853407.15124: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.15134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.15146: variable 'omit' from source: magic vars 18714 1726853407.15534: variable 'ansible_distribution_major_version' from source: facts 18714 1726853407.15558: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853407.15568: variable 'omit' from source: magic vars 18714 1726853407.15599: variable 'omit' from source: magic vars 18714 1726853407.15643: variable 'omit' from source: magic vars 18714 1726853407.15695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853407.15736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853407.15772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853407.15795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853407.15811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853407.15874: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853407.15877: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.15880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.15965: Set connection var ansible_shell_executable to /bin/sh 18714 1726853407.15985: Set 
connection var ansible_timeout to 10 18714 1726853407.15994: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853407.16075: Set connection var ansible_connection to ssh 18714 1726853407.16078: Set connection var ansible_shell_type to sh 18714 1726853407.16088: Set connection var ansible_pipelining to False 18714 1726853407.16090: variable 'ansible_shell_executable' from source: unknown 18714 1726853407.16092: variable 'ansible_connection' from source: unknown 18714 1726853407.16094: variable 'ansible_module_compression' from source: unknown 18714 1726853407.16096: variable 'ansible_shell_type' from source: unknown 18714 1726853407.16098: variable 'ansible_shell_executable' from source: unknown 18714 1726853407.16100: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853407.16102: variable 'ansible_pipelining' from source: unknown 18714 1726853407.16104: variable 'ansible_timeout' from source: unknown 18714 1726853407.16105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853407.16276: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853407.16304: variable 'omit' from source: magic vars 18714 1726853407.16307: starting attempt loop 18714 1726853407.16309: running the handler 18714 1726853407.16412: variable 'ansible_facts' from source: unknown 18714 1726853407.16415: _low_level_execute_command(): starting 18714 1726853407.16417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853407.17132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853407.17145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 
1726853407.17196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853407.17288: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853407.17311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853407.17335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853407.17393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853407.19889: stdout chunk (state=3): >>>/root <<< 18714 1726853407.19963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853407.19978: stdout chunk (state=3): >>><<< 18714 1726853407.19992: stderr chunk (state=3): >>><<< 18714 1726853407.20177: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853407.20180: _low_level_execute_command(): starting 18714 1726853407.20183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300 `" && echo ansible-tmp-1726853407.201039-18879-191723479148300="` echo /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300 `" ) && sleep 0' 18714 1726853407.21008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853407.21022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853407.21038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853407.21058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853407.21078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853407.21188: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853407.21200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853407.21218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853407.21294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853407.24046: stdout chunk (state=3): >>>ansible-tmp-1726853407.201039-18879-191723479148300=/root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300 <<< 18714 1726853407.24228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853407.24303: stderr chunk (state=3): >>><<< 18714 1726853407.24306: stdout chunk (state=3): >>><<< 18714 1726853407.24542: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853407.201039-18879-191723479148300=/root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853407.24545: variable 'ansible_module_compression' from source: unknown 18714 1726853407.24548: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853407.24609: variable 'ansible_facts' from source: unknown 18714 1726853407.25110: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/AnsiballZ_setup.py 18714 1726853407.25708: Sending initial data 18714 1726853407.25717: Sent initial data (153 bytes) 18714 1726853407.26598: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853407.26683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853407.26885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853407.26969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853407.29277: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853407.29313: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853407.29362: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpkkgexztv /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/AnsiballZ_setup.py <<< 18714 1726853407.29446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/AnsiballZ_setup.py" <<< 18714 1726853407.29476: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18714 1726853407.29491: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpkkgexztv" to remote "/root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/AnsiballZ_setup.py" <<< 18714 1726853407.31916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853407.32031: stderr chunk (state=3): >>><<< 18714 1726853407.32034: stdout chunk (state=3): >>><<< 18714 1726853407.32036: done transferring module to remote 18714 1726853407.32038: _low_level_execute_command(): starting 18714 1726853407.32040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/ /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/AnsiballZ_setup.py && sleep 0' 18714 1726853407.33134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853407.33345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853407.33479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853407.33903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853407.35989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853407.36100: stderr chunk (state=3): >>><<< 18714 1726853407.36113: stdout chunk (state=3): >>><<< 18714 1726853407.36231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853407.36235: _low_level_execute_command(): starting 18714 1726853407.36238: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/AnsiballZ_setup.py && sleep 0' 18714 1726853407.37005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853407.37055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853407.37128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853407.37167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853407.37194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853407.37238: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 4 <<< 18714 1726853408.13078: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.50634765625, "5m": 0.373046875, "15m": 0.17431640625}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fips": false, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", <<< 18714 1726853408.13083: stdout chunk (state=3): >>>"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": 
"off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-38<<< 18714 1726853408.13092: stdout chunk (state=3): >>>37607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": 
"ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 574, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794705408, "block_size": 4096, "block_total": 65519099, "block_available": 63914723, "block_used": 1604376, "inode_total": 131070960, "inode_available": 131029065, "inode_used": 41895, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh 
%s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QU<<< 18714 1726853408.13124: stdout chunk (state=3): >>>WoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "08", "epoch": "1726853408", "epoch_int": "1726853408", "date": "2024-09-20", "time": "13:30:08", 
"iso8601_micro": "2024-09-20T17:30:08.127501Z", "iso8601": "2024-09-20T17:30:08Z", "iso8601_basic": "20240920T133008127501", "iso8601_basic_short": "20240920T133008", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853408.15005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853408.15059: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 18714 1726853408.15063: stdout chunk (state=3): >>><<< 18714 1726853408.15073: stderr chunk (state=3): >>><<< 18714 1726853408.15279: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.50634765625, "5m": 0.373046875, "15m": 
0.17431640625}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, 
"nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 574, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 
268366229504, "size_available": 261794705408, "block_size": 4096, "block_total": 65519099, "block_available": 63914723, "block_used": 1604376, "inode_total": 131070960, "inode_available": 131029065, "inode_used": 41895, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "08", "epoch": "1726853408", "epoch_int": "1726853408", "date": "2024-09-20", "time": "13:30:08", "iso8601_micro": "2024-09-20T17:30:08.127501Z", "iso8601": "2024-09-20T17:30:08Z", "iso8601_basic": "20240920T133008127501", "iso8601_basic_short": "20240920T133008", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853408.15437: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853408.15467: _low_level_execute_command(): starting 18714 1726853408.15482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853407.201039-18879-191723479148300/ > /dev/null 2>&1 && sleep 0' 18714 1726853408.16130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853408.16145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853408.16163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853408.16289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853408.16293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853408.16313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853408.16390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853408.18259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853408.18284: stdout chunk (state=3): >>><<< 18714 1726853408.18287: stderr chunk (state=3): >>><<< 18714 1726853408.18303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853408.18476: handler run complete 18714 1726853408.18479: variable 'ansible_facts' from source: unknown 18714 1726853408.18539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.18852: variable 'ansible_facts' from source: unknown 18714 1726853408.18952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.19112: attempt loop complete, returning result 18714 1726853408.19122: _execute() done 18714 1726853408.19131: dumping result to json 18714 1726853408.19167: done dumping result, returning 18714 1726853408.19184: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-0000000000d8] 18714 1726853408.19192: sending task result for task 02083763-bbaf-e784-4f7d-0000000000d8 18714 1726853408.19878: done sending task result for task 02083763-bbaf-e784-4f7d-0000000000d8 18714 1726853408.19882: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853408.20126: no more pending results, returning what we have 18714 1726853408.20129: results queue empty 18714 1726853408.20130: checking for any_errors_fatal 18714 1726853408.20131: done checking for any_errors_fatal 18714 1726853408.20131: checking for max_fail_percentage 18714 1726853408.20133: done checking for max_fail_percentage 18714 1726853408.20134: checking to see if all hosts have failed and the running result is not ok 18714 1726853408.20134: done checking to see if all hosts have failed 18714 1726853408.20135: getting the remaining hosts for this loop 18714 1726853408.20136: done getting the remaining hosts for this loop 18714 1726853408.20139: getting the next task for host managed_node1 
18714 1726853408.20144: done getting next task for host managed_node1 18714 1726853408.20146: ^ task is: TASK: meta (flush_handlers) 18714 1726853408.20147: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853408.20153: getting variables 18714 1726853408.20154: in VariableManager get_vars() 18714 1726853408.20176: Calling all_inventory to load vars for managed_node1 18714 1726853408.20178: Calling groups_inventory to load vars for managed_node1 18714 1726853408.20181: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.20191: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.20194: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.20196: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.20384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.20579: done with get_vars() 18714 1726853408.20587: done getting variables 18714 1726853408.20647: in VariableManager get_vars() 18714 1726853408.20657: Calling all_inventory to load vars for managed_node1 18714 1726853408.20659: Calling groups_inventory to load vars for managed_node1 18714 1726853408.20661: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.20665: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.20667: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.20669: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.20797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.20986: done with get_vars() 18714 
1726853408.20997: done queuing things up, now waiting for results queue to drain 18714 1726853408.20998: results queue empty 18714 1726853408.20999: checking for any_errors_fatal 18714 1726853408.21002: done checking for any_errors_fatal 18714 1726853408.21002: checking for max_fail_percentage 18714 1726853408.21003: done checking for max_fail_percentage 18714 1726853408.21008: checking to see if all hosts have failed and the running result is not ok 18714 1726853408.21009: done checking to see if all hosts have failed 18714 1726853408.21010: getting the remaining hosts for this loop 18714 1726853408.21010: done getting the remaining hosts for this loop 18714 1726853408.21013: getting the next task for host managed_node1 18714 1726853408.21016: done getting next task for host managed_node1 18714 1726853408.21018: ^ task is: TASK: Show inside ethernet tests 18714 1726853408.21019: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853408.21020: getting variables 18714 1726853408.21021: in VariableManager get_vars() 18714 1726853408.21028: Calling all_inventory to load vars for managed_node1 18714 1726853408.21030: Calling groups_inventory to load vars for managed_node1 18714 1726853408.21032: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.21036: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.21038: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.21040: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.21169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.21367: done with get_vars() 18714 1726853408.21376: done getting variables 18714 1726853408.21446: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 13:30:08 -0400 (0:00:01.071) 0:00:04.598 ****** 18714 1726853408.21473: entering _queue_task() for managed_node1/debug 18714 1726853408.21475: Creating lock for debug 18714 1726853408.21735: worker is 1 (out of 1 available) 18714 1726853408.21747: exiting _queue_task() for managed_node1/debug 18714 1726853408.21759: done queuing things up, now waiting for results queue to drain 18714 1726853408.21759: waiting for pending results... 
18714 1726853408.21988: running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests 18714 1726853408.22074: in run() - task 02083763-bbaf-e784-4f7d-00000000000b 18714 1726853408.22093: variable 'ansible_search_path' from source: unknown 18714 1726853408.22134: calling self._execute() 18714 1726853408.22213: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.22222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.22234: variable 'omit' from source: magic vars 18714 1726853408.22845: variable 'ansible_distribution_major_version' from source: facts 18714 1726853408.23010: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853408.23012: variable 'omit' from source: magic vars 18714 1726853408.23014: variable 'omit' from source: magic vars 18714 1726853408.23016: variable 'omit' from source: magic vars 18714 1726853408.23127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853408.23166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853408.23246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853408.23268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853408.23286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853408.23364: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853408.23374: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.23382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.23605: Set connection var ansible_shell_executable to /bin/sh 18714 
1726853408.23618: Set connection var ansible_timeout to 10 18714 1726853408.23629: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853408.23641: Set connection var ansible_connection to ssh 18714 1726853408.23650: Set connection var ansible_shell_type to sh 18714 1726853408.23660: Set connection var ansible_pipelining to False 18714 1726853408.23688: variable 'ansible_shell_executable' from source: unknown 18714 1726853408.23702: variable 'ansible_connection' from source: unknown 18714 1726853408.23714: variable 'ansible_module_compression' from source: unknown 18714 1726853408.23722: variable 'ansible_shell_type' from source: unknown 18714 1726853408.23813: variable 'ansible_shell_executable' from source: unknown 18714 1726853408.23817: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.23820: variable 'ansible_pipelining' from source: unknown 18714 1726853408.23822: variable 'ansible_timeout' from source: unknown 18714 1726853408.23824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.23942: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853408.23945: variable 'omit' from source: magic vars 18714 1726853408.23956: starting attempt loop 18714 1726853408.24030: running the handler 18714 1726853408.24034: handler run complete 18714 1726853408.24060: attempt loop complete, returning result 18714 1726853408.24068: _execute() done 18714 1726853408.24078: dumping result to json 18714 1726853408.24085: done dumping result, returning 18714 1726853408.24096: done running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests [02083763-bbaf-e784-4f7d-00000000000b] 18714 1726853408.24104: sending task 
result for task 02083763-bbaf-e784-4f7d-00000000000b ok: [managed_node1] => {} MSG: Inside ethernet tests 18714 1726853408.24274: no more pending results, returning what we have 18714 1726853408.24278: results queue empty 18714 1726853408.24279: checking for any_errors_fatal 18714 1726853408.24281: done checking for any_errors_fatal 18714 1726853408.24282: checking for max_fail_percentage 18714 1726853408.24283: done checking for max_fail_percentage 18714 1726853408.24284: checking to see if all hosts have failed and the running result is not ok 18714 1726853408.24285: done checking to see if all hosts have failed 18714 1726853408.24285: getting the remaining hosts for this loop 18714 1726853408.24286: done getting the remaining hosts for this loop 18714 1726853408.24289: getting the next task for host managed_node1 18714 1726853408.24295: done getting next task for host managed_node1 18714 1726853408.24298: ^ task is: TASK: Show network_provider 18714 1726853408.24299: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853408.24302: getting variables 18714 1726853408.24304: in VariableManager get_vars() 18714 1726853408.24330: Calling all_inventory to load vars for managed_node1 18714 1726853408.24333: Calling groups_inventory to load vars for managed_node1 18714 1726853408.24336: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.24345: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.24348: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.24353: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.24553: done sending task result for task 02083763-bbaf-e784-4f7d-00000000000b 18714 1726853408.24557: WORKER PROCESS EXITING 18714 1726853408.24573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.24760: done with get_vars() 18714 1726853408.24770: done getting variables 18714 1726853408.24832: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 13:30:08 -0400 (0:00:00.033) 0:00:04.632 ****** 18714 1726853408.24860: entering _queue_task() for managed_node1/debug 18714 1726853408.25108: worker is 1 (out of 1 available) 18714 1726853408.25119: exiting _queue_task() for managed_node1/debug 18714 1726853408.25131: done queuing things up, now waiting for results queue to drain 18714 1726853408.25132: waiting for pending results... 
18714 1726853408.25370: running TaskExecutor() for managed_node1/TASK: Show network_provider 18714 1726853408.25467: in run() - task 02083763-bbaf-e784-4f7d-00000000000c 18714 1726853408.25576: variable 'ansible_search_path' from source: unknown 18714 1726853408.25580: calling self._execute() 18714 1726853408.25602: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.25613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.25627: variable 'omit' from source: magic vars 18714 1726853408.25987: variable 'ansible_distribution_major_version' from source: facts 18714 1726853408.26004: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853408.26019: variable 'omit' from source: magic vars 18714 1726853408.26056: variable 'omit' from source: magic vars 18714 1726853408.26096: variable 'omit' from source: magic vars 18714 1726853408.26142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853408.26186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853408.26210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853408.26236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853408.26255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853408.26343: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853408.26346: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.26351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.26412: Set connection var ansible_shell_executable to /bin/sh 18714 
1726853408.26424: Set connection var ansible_timeout to 10 18714 1726853408.26434: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853408.26447: Set connection var ansible_connection to ssh 18714 1726853408.26464: Set connection var ansible_shell_type to sh 18714 1726853408.26476: Set connection var ansible_pipelining to False 18714 1726853408.26499: variable 'ansible_shell_executable' from source: unknown 18714 1726853408.26564: variable 'ansible_connection' from source: unknown 18714 1726853408.26568: variable 'ansible_module_compression' from source: unknown 18714 1726853408.26572: variable 'ansible_shell_type' from source: unknown 18714 1726853408.26574: variable 'ansible_shell_executable' from source: unknown 18714 1726853408.26576: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.26579: variable 'ansible_pipelining' from source: unknown 18714 1726853408.26581: variable 'ansible_timeout' from source: unknown 18714 1726853408.26583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.26762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853408.26782: variable 'omit' from source: magic vars 18714 1726853408.26792: starting attempt loop 18714 1726853408.26800: running the handler 18714 1726853408.26848: variable 'network_provider' from source: set_fact 18714 1726853408.26931: variable 'network_provider' from source: set_fact 18714 1726853408.26974: handler run complete 18714 1726853408.26977: attempt loop complete, returning result 18714 1726853408.26980: _execute() done 18714 1726853408.26983: dumping result to json 18714 1726853408.26985: done dumping result, returning 18714 1726853408.27076: done 
running TaskExecutor() for managed_node1/TASK: Show network_provider [02083763-bbaf-e784-4f7d-00000000000c] 18714 1726853408.27080: sending task result for task 02083763-bbaf-e784-4f7d-00000000000c 18714 1726853408.27147: done sending task result for task 02083763-bbaf-e784-4f7d-00000000000c 18714 1726853408.27153: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 18714 1726853408.27202: no more pending results, returning what we have 18714 1726853408.27205: results queue empty 18714 1726853408.27206: checking for any_errors_fatal 18714 1726853408.27214: done checking for any_errors_fatal 18714 1726853408.27215: checking for max_fail_percentage 18714 1726853408.27216: done checking for max_fail_percentage 18714 1726853408.27217: checking to see if all hosts have failed and the running result is not ok 18714 1726853408.27218: done checking to see if all hosts have failed 18714 1726853408.27218: getting the remaining hosts for this loop 18714 1726853408.27219: done getting the remaining hosts for this loop 18714 1726853408.27223: getting the next task for host managed_node1 18714 1726853408.27230: done getting next task for host managed_node1 18714 1726853408.27232: ^ task is: TASK: meta (flush_handlers) 18714 1726853408.27234: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853408.27237: getting variables 18714 1726853408.27239: in VariableManager get_vars() 18714 1726853408.27267: Calling all_inventory to load vars for managed_node1 18714 1726853408.27272: Calling groups_inventory to load vars for managed_node1 18714 1726853408.27276: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.27286: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.27288: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.27291: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.27711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.27872: done with get_vars() 18714 1726853408.27879: done getting variables 18714 1726853408.27924: in VariableManager get_vars() 18714 1726853408.27929: Calling all_inventory to load vars for managed_node1 18714 1726853408.27931: Calling groups_inventory to load vars for managed_node1 18714 1726853408.27932: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.27936: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.27937: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.27939: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.28027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.28135: done with get_vars() 18714 1726853408.28144: done queuing things up, now waiting for results queue to drain 18714 1726853408.28145: results queue empty 18714 1726853408.28145: checking for any_errors_fatal 18714 1726853408.28147: done checking for any_errors_fatal 18714 1726853408.28147: checking for max_fail_percentage 18714 1726853408.28148: done checking for max_fail_percentage 18714 1726853408.28148: checking to see if all hosts have failed and the running result is not 
ok 18714 1726853408.28151: done checking to see if all hosts have failed 18714 1726853408.28151: getting the remaining hosts for this loop 18714 1726853408.28152: done getting the remaining hosts for this loop 18714 1726853408.28154: getting the next task for host managed_node1 18714 1726853408.28161: done getting next task for host managed_node1 18714 1726853408.28162: ^ task is: TASK: meta (flush_handlers) 18714 1726853408.28163: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853408.28164: getting variables 18714 1726853408.28165: in VariableManager get_vars() 18714 1726853408.28170: Calling all_inventory to load vars for managed_node1 18714 1726853408.28173: Calling groups_inventory to load vars for managed_node1 18714 1726853408.28175: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.28178: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.28179: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.28181: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.28281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.28386: done with get_vars() 18714 1726853408.28392: done getting variables 18714 1726853408.28422: in VariableManager get_vars() 18714 1726853408.28428: Calling all_inventory to load vars for managed_node1 18714 1726853408.28430: Calling groups_inventory to load vars for managed_node1 18714 1726853408.28431: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.28434: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.28435: Calling groups_plugins_inventory to load vars for 
managed_node1 18714 1726853408.28437: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.28513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.28616: done with get_vars() 18714 1726853408.28624: done queuing things up, now waiting for results queue to drain 18714 1726853408.28625: results queue empty 18714 1726853408.28625: checking for any_errors_fatal 18714 1726853408.28627: done checking for any_errors_fatal 18714 1726853408.28627: checking for max_fail_percentage 18714 1726853408.28628: done checking for max_fail_percentage 18714 1726853408.28628: checking to see if all hosts have failed and the running result is not ok 18714 1726853408.28629: done checking to see if all hosts have failed 18714 1726853408.28629: getting the remaining hosts for this loop 18714 1726853408.28630: done getting the remaining hosts for this loop 18714 1726853408.28632: getting the next task for host managed_node1 18714 1726853408.28634: done getting next task for host managed_node1 18714 1726853408.28635: ^ task is: None 18714 1726853408.28636: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853408.28636: done queuing things up, now waiting for results queue to drain 18714 1726853408.28637: results queue empty 18714 1726853408.28638: checking for any_errors_fatal 18714 1726853408.28638: done checking for any_errors_fatal 18714 1726853408.28639: checking for max_fail_percentage 18714 1726853408.28640: done checking for max_fail_percentage 18714 1726853408.28640: checking to see if all hosts have failed and the running result is not ok 18714 1726853408.28641: done checking to see if all hosts have failed 18714 1726853408.28642: getting the next task for host managed_node1 18714 1726853408.28644: done getting next task for host managed_node1 18714 1726853408.28644: ^ task is: None 18714 1726853408.28645: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853408.28679: in VariableManager get_vars() 18714 1726853408.28690: done with get_vars() 18714 1726853408.28694: in VariableManager get_vars() 18714 1726853408.28699: done with get_vars() 18714 1726853408.28702: variable 'omit' from source: magic vars 18714 1726853408.28721: in VariableManager get_vars() 18714 1726853408.28726: done with get_vars() 18714 1726853408.28738: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 18714 1726853408.28855: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853408.28875: getting the remaining hosts for this loop 18714 1726853408.28876: done getting the remaining hosts for this loop 18714 1726853408.28878: getting the next task for host managed_node1 18714 1726853408.28880: done getting next task for host managed_node1 18714 1726853408.28881: ^ task is: TASK: Gathering Facts 18714 1726853408.28882: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853408.28883: getting variables 18714 1726853408.28884: in VariableManager get_vars() 18714 1726853408.28889: Calling all_inventory to load vars for managed_node1 18714 1726853408.28890: Calling groups_inventory to load vars for managed_node1 18714 1726853408.28891: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853408.28894: Calling all_plugins_play to load vars for managed_node1 18714 1726853408.28896: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853408.28897: Calling groups_plugins_play to load vars for managed_node1 18714 1726853408.29000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853408.29104: done with get_vars() 18714 1726853408.29109: done getting variables 18714 1726853408.29133: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 13:30:08 -0400 (0:00:00.042) 0:00:04.674 ****** 18714 1726853408.29152: entering _queue_task() for managed_node1/gather_facts 18714 1726853408.29348: worker is 1 (out of 1 available) 18714 1726853408.29364: exiting _queue_task() for managed_node1/gather_facts 18714 1726853408.29376: done queuing things up, now waiting for results queue to drain 18714 1726853408.29377: waiting for pending results... 
18714 1726853408.29518: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853408.29589: in run() - task 02083763-bbaf-e784-4f7d-0000000000f0 18714 1726853408.29638: variable 'ansible_search_path' from source: unknown 18714 1726853408.29691: calling self._execute() 18714 1726853408.29721: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.29778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.29782: variable 'omit' from source: magic vars 18714 1726853408.30104: variable 'ansible_distribution_major_version' from source: facts 18714 1726853408.30119: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853408.30128: variable 'omit' from source: magic vars 18714 1726853408.30158: variable 'omit' from source: magic vars 18714 1726853408.30198: variable 'omit' from source: magic vars 18714 1726853408.30238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853408.30377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853408.30380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853408.30383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853408.30385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853408.30387: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853408.30389: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.30391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.30479: Set connection var ansible_shell_executable to /bin/sh 18714 1726853408.30492: Set 
connection var ansible_timeout to 10 18714 1726853408.30502: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853408.30514: Set connection var ansible_connection to ssh 18714 1726853408.30524: Set connection var ansible_shell_type to sh 18714 1726853408.30534: Set connection var ansible_pipelining to False 18714 1726853408.30562: variable 'ansible_shell_executable' from source: unknown 18714 1726853408.30570: variable 'ansible_connection' from source: unknown 18714 1726853408.30580: variable 'ansible_module_compression' from source: unknown 18714 1726853408.30587: variable 'ansible_shell_type' from source: unknown 18714 1726853408.30593: variable 'ansible_shell_executable' from source: unknown 18714 1726853408.30601: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853408.30609: variable 'ansible_pipelining' from source: unknown 18714 1726853408.30616: variable 'ansible_timeout' from source: unknown 18714 1726853408.30624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853408.30799: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853408.30812: variable 'omit' from source: magic vars 18714 1726853408.30816: starting attempt loop 18714 1726853408.30818: running the handler 18714 1726853408.30831: variable 'ansible_facts' from source: unknown 18714 1726853408.30854: _low_level_execute_command(): starting 18714 1726853408.30861: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853408.31356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 
1726853408.31360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853408.31363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853408.31366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853408.31417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853408.31422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853408.31424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853408.31469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853408.33186: stdout chunk (state=3): >>>/root <<< 18714 1726853408.33296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853408.33300: stdout chunk (state=3): >>><<< 18714 1726853408.33302: stderr chunk (state=3): >>><<< 18714 1726853408.33424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853408.33428: _low_level_execute_command(): starting 18714 1726853408.33431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100 `" && echo ansible-tmp-1726853408.333298-18942-47325954614100="` echo /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100 `" ) && sleep 0' 18714 1726853408.33973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853408.33977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853408.33979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853408.33988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853408.34032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853408.34036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853408.34084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853408.36006: stdout chunk (state=3): >>>ansible-tmp-1726853408.333298-18942-47325954614100=/root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100 <<< 18714 1726853408.36065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853408.36106: stderr chunk (state=3): >>><<< 18714 1726853408.36115: stdout chunk (state=3): >>><<< 18714 1726853408.36215: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853408.333298-18942-47325954614100=/root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853408.36219: variable 'ansible_module_compression' from source: unknown 18714 1726853408.36221: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853408.36268: variable 'ansible_facts' from source: unknown 18714 1726853408.36475: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/AnsiballZ_setup.py 18714 1726853408.36667: Sending initial data 18714 1726853408.36670: Sent initial data (152 bytes) 18714 1726853408.37294: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853408.37341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853408.37361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853408.37393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853408.37518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853408.39074: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853408.39108: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853408.39162: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpcb0pkvm6 /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/AnsiballZ_setup.py <<< 18714 1726853408.39174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/AnsiballZ_setup.py" <<< 18714 1726853408.39230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpcb0pkvm6" to remote "/root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/AnsiballZ_setup.py" <<< 18714 1726853408.40984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853408.41129: stdout chunk (state=3): >>><<< 18714 1726853408.41132: stderr chunk (state=3): >>><<< 18714 1726853408.41152: done transferring module to remote 18714 1726853408.41278: _low_level_execute_command(): starting 18714 1726853408.41282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/ /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/AnsiballZ_setup.py && sleep 0' 18714 1726853408.42025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853408.42038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853408.42048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853408.42067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853408.42156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853408.42167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853408.42240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853408.44037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853408.44053: stdout chunk (state=3): >>><<< 18714 1726853408.44067: stderr chunk (state=3): >>><<< 18714 1726853408.44100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853408.44109: _low_level_execute_command(): starting 18714 1726853408.44120: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/AnsiballZ_setup.py && sleep 0' 18714 1726853408.44758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853408.44772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853408.44784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853408.44799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853408.44813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853408.44832: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853408.44843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853408.44860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853408.44870: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853408.44944: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853408.44970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853408.44988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853408.45007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853408.45079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.08698: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3307, "used": 224}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": 
"2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 574, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794889728, "block_size": 4096, "block_total": 65519099, "block_available": 63914768, "block_used": 1604331, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.46533203125, "5m": 0.36669921875, "15m": 0.17333984375}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, 
"ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, 
"config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "09", "epoch": "1726853409", "epoch_int": "1726853409", "date": "2024-09-20", "time": "13:30:09", "iso8601_micro": "2024-09-20T17:30:09.084023Z", "iso8601": "2024-09-20T17:30:09Z", "iso8601_basic": "20240920T133009084023", "iso8601_basic_short": "20240920T133009", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853409.10725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853409.10749: stderr chunk (state=3): >>><<< 18714 1726853409.10753: stdout chunk (state=3): >>><<< 18714 1726853409.10785: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", 
"ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3307, "used": 224}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 574, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794889728, "block_size": 4096, "block_total": 65519099, "block_available": 63914768, "block_used": 1604331, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], 
"nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.46533203125, "5m": 0.36669921875, "15m": 0.17333984375}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "09", "epoch": "1726853409", "epoch_int": "1726853409", "date": "2024-09-20", "time": "13:30:09", "iso8601_micro": "2024-09-20T17:30:09.084023Z", "iso8601": "2024-09-20T17:30:09Z", "iso8601_basic": "20240920T133009084023", "iso8601_basic_short": "20240920T133009", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853409.10993: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853409.11011: _low_level_execute_command(): starting 18714 1726853409.11015: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853408.333298-18942-47325954614100/ > /dev/null 2>&1 && sleep 0' 18714 1726853409.11458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853409.11461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.11463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.11465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853409.11467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.11518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.11521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853409.11523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.11566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.13422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.13449: stderr chunk (state=3): >>><<< 18714 1726853409.13452: stdout chunk (state=3): >>><<< 18714 1726853409.13468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853409.13477: handler run complete 18714 1726853409.13556: variable 'ansible_facts' from source: unknown 18714 1726853409.13625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.13802: variable 'ansible_facts' from source: unknown 18714 1726853409.13858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.13930: attempt loop complete, returning result 18714 1726853409.13934: _execute() done 18714 1726853409.13936: dumping result to json 18714 1726853409.13960: done dumping result, returning 18714 1726853409.13967: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-0000000000f0] 18714 1726853409.13972: sending task result for task 02083763-bbaf-e784-4f7d-0000000000f0 18714 1726853409.14208: done sending task result for task 02083763-bbaf-e784-4f7d-0000000000f0 18714 1726853409.14211: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853409.14460: no more pending results, returning what we have 18714 1726853409.14463: results queue empty 18714 1726853409.14464: checking for any_errors_fatal 18714 1726853409.14466: done checking for any_errors_fatal 18714 1726853409.14466: checking for max_fail_percentage 18714 1726853409.14468: done checking for max_fail_percentage 18714 1726853409.14469: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.14470: done checking to see if all hosts have failed 18714 1726853409.14485: getting the remaining hosts for this loop 18714 1726853409.14487: done getting the remaining hosts for this loop 18714 1726853409.14490: getting the next task for host managed_node1 18714 1726853409.14495: done getting next task for host managed_node1 18714 1726853409.14515: ^ task is: TASK: meta (flush_handlers) 18714 
1726853409.14517: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853409.14521: getting variables 18714 1726853409.14522: in VariableManager get_vars() 18714 1726853409.14543: Calling all_inventory to load vars for managed_node1 18714 1726853409.14546: Calling groups_inventory to load vars for managed_node1 18714 1726853409.14549: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.14559: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.14562: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.14565: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.14742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.14930: done with get_vars() 18714 1726853409.14946: done getting variables 18714 1726853409.15014: in VariableManager get_vars() 18714 1726853409.15023: Calling all_inventory to load vars for managed_node1 18714 1726853409.15025: Calling groups_inventory to load vars for managed_node1 18714 1726853409.15028: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.15032: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.15034: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.15037: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.15217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.15452: done with get_vars() 18714 1726853409.15462: done queuing things up, now waiting for results queue to drain 18714 1726853409.15463: results queue empty 18714 
1726853409.15464: checking for any_errors_fatal 18714 1726853409.15466: done checking for any_errors_fatal 18714 1726853409.15490: checking for max_fail_percentage 18714 1726853409.15492: done checking for max_fail_percentage 18714 1726853409.15493: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.15493: done checking to see if all hosts have failed 18714 1726853409.15494: getting the remaining hosts for this loop 18714 1726853409.15495: done getting the remaining hosts for this loop 18714 1726853409.15502: getting the next task for host managed_node1 18714 1726853409.15505: done getting next task for host managed_node1 18714 1726853409.15507: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 18714 1726853409.15508: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.15510: getting variables 18714 1726853409.15510: in VariableManager get_vars() 18714 1726853409.15516: Calling all_inventory to load vars for managed_node1 18714 1726853409.15517: Calling groups_inventory to load vars for managed_node1 18714 1726853409.15518: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.15521: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.15523: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.15524: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.15606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.15715: done with get_vars() 18714 1726853409.15721: done getting variables 18714 1726853409.15748: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853409.15854: variable 'type' from source: play vars 18714 1726853409.15858: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 13:30:09 -0400 (0:00:00.867) 0:00:05.542 ****** 18714 1726853409.15888: entering _queue_task() for managed_node1/set_fact 18714 1726853409.16111: worker is 1 (out of 1 available) 18714 1726853409.16124: exiting _queue_task() for managed_node1/set_fact 18714 1726853409.16135: done queuing things up, now waiting for results queue to drain 18714 1726853409.16136: waiting for pending results... 
18714 1726853409.16282: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 18714 1726853409.16328: in run() - task 02083763-bbaf-e784-4f7d-00000000000f 18714 1726853409.16340: variable 'ansible_search_path' from source: unknown 18714 1726853409.16374: calling self._execute() 18714 1726853409.16431: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.16435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.16444: variable 'omit' from source: magic vars 18714 1726853409.16696: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.16706: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.16712: variable 'omit' from source: magic vars 18714 1726853409.16731: variable 'omit' from source: magic vars 18714 1726853409.16754: variable 'type' from source: play vars 18714 1726853409.16804: variable 'type' from source: play vars 18714 1726853409.16812: variable 'interface' from source: play vars 18714 1726853409.16855: variable 'interface' from source: play vars 18714 1726853409.16865: variable 'omit' from source: magic vars 18714 1726853409.16899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853409.16926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853409.16942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853409.16955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.16966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.16990: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 
1726853409.16994: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.16996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.17066: Set connection var ansible_shell_executable to /bin/sh 18714 1726853409.17072: Set connection var ansible_timeout to 10 18714 1726853409.17078: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853409.17084: Set connection var ansible_connection to ssh 18714 1726853409.17089: Set connection var ansible_shell_type to sh 18714 1726853409.17094: Set connection var ansible_pipelining to False 18714 1726853409.17109: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.17111: variable 'ansible_connection' from source: unknown 18714 1726853409.17114: variable 'ansible_module_compression' from source: unknown 18714 1726853409.17116: variable 'ansible_shell_type' from source: unknown 18714 1726853409.17120: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.17122: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.17124: variable 'ansible_pipelining' from source: unknown 18714 1726853409.17127: variable 'ansible_timeout' from source: unknown 18714 1726853409.17136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.17229: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853409.17238: variable 'omit' from source: magic vars 18714 1726853409.17241: starting attempt loop 18714 1726853409.17244: running the handler 18714 1726853409.17256: handler run complete 18714 1726853409.17264: attempt loop complete, returning result 18714 1726853409.17266: _execute() done 18714 
1726853409.17269: dumping result to json 18714 1726853409.17273: done dumping result, returning 18714 1726853409.17279: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 [02083763-bbaf-e784-4f7d-00000000000f] 18714 1726853409.17284: sending task result for task 02083763-bbaf-e784-4f7d-00000000000f 18714 1726853409.17360: done sending task result for task 02083763-bbaf-e784-4f7d-00000000000f 18714 1726853409.17363: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 18714 1726853409.17412: no more pending results, returning what we have 18714 1726853409.17415: results queue empty 18714 1726853409.17417: checking for any_errors_fatal 18714 1726853409.17419: done checking for any_errors_fatal 18714 1726853409.17419: checking for max_fail_percentage 18714 1726853409.17421: done checking for max_fail_percentage 18714 1726853409.17422: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.17423: done checking to see if all hosts have failed 18714 1726853409.17423: getting the remaining hosts for this loop 18714 1726853409.17424: done getting the remaining hosts for this loop 18714 1726853409.17428: getting the next task for host managed_node1 18714 1726853409.17434: done getting next task for host managed_node1 18714 1726853409.17436: ^ task is: TASK: Include the task 'show_interfaces.yml' 18714 1726853409.17438: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.17442: getting variables 18714 1726853409.17443: in VariableManager get_vars() 18714 1726853409.17473: Calling all_inventory to load vars for managed_node1 18714 1726853409.17476: Calling groups_inventory to load vars for managed_node1 18714 1726853409.17482: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.17494: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.17496: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.17502: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.17735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.18022: done with get_vars() 18714 1726853409.18031: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 13:30:09 -0400 (0:00:00.022) 0:00:05.564 ****** 18714 1726853409.18119: entering _queue_task() for managed_node1/include_tasks 18714 1726853409.18359: worker is 1 (out of 1 available) 18714 1726853409.18370: exiting _queue_task() for managed_node1/include_tasks 18714 1726853409.18382: done queuing things up, now waiting for results queue to drain 18714 1726853409.18383: waiting for pending results... 
18714 1726853409.18786: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18714 1726853409.18791: in run() - task 02083763-bbaf-e784-4f7d-000000000010 18714 1726853409.18793: variable 'ansible_search_path' from source: unknown 18714 1726853409.18795: calling self._execute() 18714 1726853409.18821: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.18832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.18845: variable 'omit' from source: magic vars 18714 1726853409.19190: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.19206: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.19215: _execute() done 18714 1726853409.19222: dumping result to json 18714 1726853409.19229: done dumping result, returning 18714 1726853409.19243: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-e784-4f7d-000000000010] 18714 1726853409.19251: sending task result for task 02083763-bbaf-e784-4f7d-000000000010 18714 1726853409.19478: done sending task result for task 02083763-bbaf-e784-4f7d-000000000010 18714 1726853409.19481: WORKER PROCESS EXITING 18714 1726853409.19505: no more pending results, returning what we have 18714 1726853409.19509: in VariableManager get_vars() 18714 1726853409.19541: Calling all_inventory to load vars for managed_node1 18714 1726853409.19544: Calling groups_inventory to load vars for managed_node1 18714 1726853409.19547: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.19559: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.19562: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.19565: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.19834: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.20015: done with get_vars() 18714 1726853409.20023: variable 'ansible_search_path' from source: unknown 18714 1726853409.20037: we have included files to process 18714 1726853409.20038: generating all_blocks data 18714 1726853409.20039: done generating all_blocks data 18714 1726853409.20040: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18714 1726853409.20041: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18714 1726853409.20043: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18714 1726853409.20189: in VariableManager get_vars() 18714 1726853409.20203: done with get_vars() 18714 1726853409.20305: done processing included file 18714 1726853409.20307: iterating over new_blocks loaded from include file 18714 1726853409.20308: in VariableManager get_vars() 18714 1726853409.20318: done with get_vars() 18714 1726853409.20320: filtering new block on tags 18714 1726853409.20335: done filtering new block on tags 18714 1726853409.20337: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 18714 1726853409.20341: extending task lists for all hosts with included blocks 18714 1726853409.20415: done extending task lists 18714 1726853409.20417: done processing included files 18714 1726853409.20418: results queue empty 18714 1726853409.20418: checking for any_errors_fatal 18714 1726853409.20421: done checking for any_errors_fatal 18714 1726853409.20422: checking for max_fail_percentage 18714 1726853409.20423: done checking for 
max_fail_percentage 18714 1726853409.20424: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.20425: done checking to see if all hosts have failed 18714 1726853409.20425: getting the remaining hosts for this loop 18714 1726853409.20426: done getting the remaining hosts for this loop 18714 1726853409.20429: getting the next task for host managed_node1 18714 1726853409.20432: done getting next task for host managed_node1 18714 1726853409.20434: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18714 1726853409.20437: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.20439: getting variables 18714 1726853409.20651: in VariableManager get_vars() 18714 1726853409.20661: Calling all_inventory to load vars for managed_node1 18714 1726853409.20663: Calling groups_inventory to load vars for managed_node1 18714 1726853409.20666: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.20670: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.20674: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.20677: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.20809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.20997: done with get_vars() 18714 1726853409.21006: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:30:09 -0400 (0:00:00.029) 0:00:05.594 ****** 18714 1726853409.21074: entering _queue_task() for managed_node1/include_tasks 18714 1726853409.21344: worker is 1 (out of 1 available) 18714 1726853409.21358: exiting _queue_task() for managed_node1/include_tasks 18714 1726853409.21370: done queuing things up, now waiting for results queue to drain 18714 1726853409.21573: waiting for pending results... 
18714 1726853409.21616: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 18714 1726853409.21735: in run() - task 02083763-bbaf-e784-4f7d-000000000104 18714 1726853409.21756: variable 'ansible_search_path' from source: unknown 18714 1726853409.21764: variable 'ansible_search_path' from source: unknown 18714 1726853409.21813: calling self._execute() 18714 1726853409.21896: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.21911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.21926: variable 'omit' from source: magic vars 18714 1726853409.22342: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.22346: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.22350: _execute() done 18714 1726853409.22352: dumping result to json 18714 1726853409.22355: done dumping result, returning 18714 1726853409.22357: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-e784-4f7d-000000000104] 18714 1726853409.22360: sending task result for task 02083763-bbaf-e784-4f7d-000000000104 18714 1726853409.22477: no more pending results, returning what we have 18714 1726853409.22484: in VariableManager get_vars() 18714 1726853409.22524: Calling all_inventory to load vars for managed_node1 18714 1726853409.22527: Calling groups_inventory to load vars for managed_node1 18714 1726853409.22530: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.22545: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.22548: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.22551: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.23029: done sending task result for task 02083763-bbaf-e784-4f7d-000000000104 18714 1726853409.23033: WORKER PROCESS EXITING 18714 
1726853409.23057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.23283: done with get_vars() 18714 1726853409.23297: variable 'ansible_search_path' from source: unknown 18714 1726853409.23299: variable 'ansible_search_path' from source: unknown 18714 1726853409.23336: we have included files to process 18714 1726853409.23337: generating all_blocks data 18714 1726853409.23338: done generating all_blocks data 18714 1726853409.23340: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18714 1726853409.23341: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18714 1726853409.23343: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18714 1726853409.23767: done processing included file 18714 1726853409.23769: iterating over new_blocks loaded from include file 18714 1726853409.23773: in VariableManager get_vars() 18714 1726853409.23784: done with get_vars() 18714 1726853409.23785: filtering new block on tags 18714 1726853409.23799: done filtering new block on tags 18714 1726853409.23801: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 18714 1726853409.23805: extending task lists for all hosts with included blocks 18714 1726853409.23885: done extending task lists 18714 1726853409.23887: done processing included files 18714 1726853409.23887: results queue empty 18714 1726853409.23888: checking for any_errors_fatal 18714 1726853409.23891: done checking for any_errors_fatal 18714 1726853409.23892: checking for max_fail_percentage 18714 1726853409.23893: done 
checking for max_fail_percentage 18714 1726853409.23894: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.23894: done checking to see if all hosts have failed 18714 1726853409.23895: getting the remaining hosts for this loop 18714 1726853409.23896: done getting the remaining hosts for this loop 18714 1726853409.23899: getting the next task for host managed_node1 18714 1726853409.23902: done getting next task for host managed_node1 18714 1726853409.23905: ^ task is: TASK: Gather current interface info 18714 1726853409.23907: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.23909: getting variables 18714 1726853409.23910: in VariableManager get_vars() 18714 1726853409.23918: Calling all_inventory to load vars for managed_node1 18714 1726853409.23921: Calling groups_inventory to load vars for managed_node1 18714 1726853409.23923: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.23927: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.23930: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.23933: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.24062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.24239: done with get_vars() 18714 1726853409.24247: done getting variables 18714 1726853409.24285: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:30:09 -0400 (0:00:00.032) 0:00:05.626 ****** 18714 1726853409.24312: entering _queue_task() for managed_node1/command 18714 1726853409.24587: worker is 1 (out of 1 available) 18714 1726853409.24599: exiting _queue_task() for managed_node1/command 18714 1726853409.24611: done queuing things up, now waiting for results queue to drain 18714 1726853409.24612: waiting for pending results... 
18714 1726853409.24851: running TaskExecutor() for managed_node1/TASK: Gather current interface info 18714 1726853409.24954: in run() - task 02083763-bbaf-e784-4f7d-000000000115 18714 1726853409.24980: variable 'ansible_search_path' from source: unknown 18714 1726853409.24990: variable 'ansible_search_path' from source: unknown 18714 1726853409.25027: calling self._execute() 18714 1726853409.25108: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.25119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.25132: variable 'omit' from source: magic vars 18714 1726853409.25551: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.25567: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.25579: variable 'omit' from source: magic vars 18714 1726853409.25631: variable 'omit' from source: magic vars 18714 1726853409.25669: variable 'omit' from source: magic vars 18714 1726853409.25713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853409.25755: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853409.25780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853409.25801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.25816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.25854: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853409.25862: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.25870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 
1726853409.25974: Set connection var ansible_shell_executable to /bin/sh 18714 1726853409.25986: Set connection var ansible_timeout to 10 18714 1726853409.25996: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853409.26006: Set connection var ansible_connection to ssh 18714 1726853409.26015: Set connection var ansible_shell_type to sh 18714 1726853409.26023: Set connection var ansible_pipelining to False 18714 1726853409.26044: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.26056: variable 'ansible_connection' from source: unknown 18714 1726853409.26063: variable 'ansible_module_compression' from source: unknown 18714 1726853409.26070: variable 'ansible_shell_type' from source: unknown 18714 1726853409.26078: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.26084: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.26091: variable 'ansible_pipelining' from source: unknown 18714 1726853409.26097: variable 'ansible_timeout' from source: unknown 18714 1726853409.26103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.26235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853409.26250: variable 'omit' from source: magic vars 18714 1726853409.26258: starting attempt loop 18714 1726853409.26269: running the handler 18714 1726853409.26376: _low_level_execute_command(): starting 18714 1726853409.26379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853409.27006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853409.27045: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.27067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853409.27157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.27174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853409.27198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.27284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.29112: stdout chunk (state=3): >>>/root <<< 18714 1726853409.29188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.29192: stdout chunk (state=3): >>><<< 18714 1726853409.29194: stderr chunk (state=3): >>><<< 18714 1726853409.29213: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853409.29324: _low_level_execute_command(): starting 18714 1726853409.29335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294 `" && echo ansible-tmp-1726853409.2931015-18988-160304141570294="` echo /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294 `" ) && sleep 0' 18714 1726853409.29938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853409.29955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.29976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853409.29994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853409.30094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.30118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.30142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853409.30168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.30232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.32154: stdout chunk (state=3): >>>ansible-tmp-1726853409.2931015-18988-160304141570294=/root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294 <<< 18714 1726853409.32299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.32311: stdout chunk (state=3): >>><<< 18714 1726853409.32323: stderr chunk (state=3): >>><<< 18714 1726853409.32348: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853409.2931015-18988-160304141570294=/root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853409.32397: variable 'ansible_module_compression' from source: unknown 18714 1726853409.32460: ANSIBALLZ: Using generic lock for ansible.legacy.command 18714 1726853409.32478: ANSIBALLZ: Acquiring lock 18714 1726853409.32576: ANSIBALLZ: Lock acquired: 139791971422656 18714 1726853409.32579: ANSIBALLZ: Creating module 18714 1726853409.46485: ANSIBALLZ: Writing module into payload 18714 1726853409.46589: ANSIBALLZ: Writing module 18714 1726853409.46614: ANSIBALLZ: Renaming module 18714 1726853409.46625: ANSIBALLZ: Done creating module 18714 1726853409.46646: variable 'ansible_facts' from source: unknown 18714 1726853409.46731: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/AnsiballZ_command.py 18714 1726853409.46928: Sending initial data 18714 1726853409.46931: Sent initial data (156 bytes) 18714 1726853409.47546: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853409.47576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.47688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.47709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.47725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853409.47747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.47836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.49529: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 18714 1726853409.49540: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853409.49567: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853409.49608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpja47xv4t /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/AnsiballZ_command.py <<< 18714 1726853409.49611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/AnsiballZ_command.py" <<< 18714 1726853409.49644: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpja47xv4t" to remote "/root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/AnsiballZ_command.py" <<< 18714 1726853409.49648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/AnsiballZ_command.py" <<< 18714 1726853409.50151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.50193: stderr chunk (state=3): >>><<< 18714 1726853409.50196: stdout chunk (state=3): >>><<< 18714 1726853409.50242: done transferring module to remote 18714 1726853409.50253: _low_level_execute_command(): starting 18714 1726853409.50259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/ /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/AnsiballZ_command.py && sleep 0' 18714 1726853409.50661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.50695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853409.50699: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.50701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.50703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853409.50709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.50753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.50758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.50798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.52560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.52585: stderr chunk (state=3): >>><<< 18714 1726853409.52587: stdout chunk (state=3): >>><<< 18714 1726853409.52597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853409.52614: _low_level_execute_command(): starting 18714 1726853409.52617: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/AnsiballZ_command.py && sleep 0' 18714 1726853409.53029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853409.53032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.53035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.53037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853409.53038: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.53087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.53090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.53138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.68477: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:30:09.680534", "end": "2024-09-20 13:30:09.683881", "delta": "0:00:00.003347", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853409.70032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853409.70059: stderr chunk (state=3): >>><<< 18714 1726853409.70062: stdout chunk (state=3): >>><<< 18714 1726853409.70079: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:30:09.680534", "end": "2024-09-20 13:30:09.683881", "delta": "0:00:00.003347", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
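The module result embedded in the stdout above is plain JSON written to stdout by the AnsiballZ wrapper on the remote host; the controller recovers the structured result by parsing that text. A minimal sketch of that recovery step (the JSON literal is copied from the log above, with `module_args` abridged; `parse_module_result` is a hypothetical helper, not ansible-core's actual code path):

```python
import json

# Raw module output captured from the remote python process (verbatim from the
# log above, invocation/module_args abridged for brevity).
raw_stdout = (
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "stderr": "", '
    '"rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:30:09.680534", '
    '"end": "2024-09-20 13:30:09.683881", "delta": "0:00:00.003347", "msg": "", '
    '"invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1"}}}'
)

def parse_module_result(stdout: str) -> dict:
    """Parse the JSON a command module prints on stdout (hypothetical helper)."""
    result = json.loads(stdout)
    # Ansible also exposes stdout split into lines as stdout_lines.
    result["stdout_lines"] = result["stdout"].splitlines()
    return result

result = parse_module_result(raw_stdout)
print(result["rc"])            # -> 0
print(result["stdout_lines"])  # -> ['bonding_masters', 'eth0', 'lo']
```

This is why the later `rm -f -r` cleanup can run unconditionally: once the JSON has been read back from the SSH stdout stream, the remote `AnsiballZ_command.py` payload is no longer needed.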
18714 1726853409.70106: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853409.70114: _low_level_execute_command(): starting 18714 1726853409.70123: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853409.2931015-18988-160304141570294/ > /dev/null 2>&1 && sleep 0' 18714 1726853409.70553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853409.70590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853409.70593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.70597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.70605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853409.70607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853409.70652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.70659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853409.70661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.70699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.72489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.72516: stderr chunk (state=3): >>><<< 18714 1726853409.72519: stdout chunk (state=3): >>><<< 18714 1726853409.72537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853409.72540: handler run complete 18714 1726853409.72558: Evaluated conditional (False): False 18714 1726853409.72566: attempt loop complete, returning result 18714 1726853409.72569: _execute() done 18714 1726853409.72573: dumping result to json 18714 1726853409.72579: done dumping result, returning 18714 1726853409.72586: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-e784-4f7d-000000000115] 18714 1726853409.72589: sending task result for task 02083763-bbaf-e784-4f7d-000000000115 18714 1726853409.72684: done sending task result for task 02083763-bbaf-e784-4f7d-000000000115 18714 1726853409.72687: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003347",
    "end": "2024-09-20 13:30:09.683881",
    "rc": 0,
    "start": "2024-09-20 13:30:09.680534"
}

STDOUT:

bonding_masters
eth0
lo

18714 1726853409.72757: no more pending results, returning what we have 18714 1726853409.72760: results queue empty 18714 1726853409.72761: checking for any_errors_fatal 18714 1726853409.72763: done checking for any_errors_fatal 18714 1726853409.72763: checking for max_fail_percentage 18714 1726853409.72765: done checking for max_fail_percentage 18714 1726853409.72766: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.72766: done checking to see if all hosts have failed 18714 1726853409.72767: getting the remaining hosts for this loop 18714 1726853409.72768: done getting the remaining hosts for this loop 18714 1726853409.72773: getting the next task for host managed_node1 18714 1726853409.72779: done getting next task for host managed_node1 18714 1726853409.72781: ^ task is: TASK: Set current_interfaces 18714 1726853409.72785: ^ 
state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853409.72788: getting variables 18714 1726853409.72852: in VariableManager get_vars() 18714 1726853409.72880: Calling all_inventory to load vars for managed_node1 18714 1726853409.72883: Calling groups_inventory to load vars for managed_node1 18714 1726853409.72886: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.72895: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.72897: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.72899: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.73024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.73138: done with get_vars() 18714 1726853409.73145: done getting variables 18714 1726853409.73190: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:30:09 -0400 (0:00:00.488) 0:00:06.115 ****** 18714 1726853409.73213: entering _queue_task() for managed_node1/set_fact 18714 1726853409.73412: worker is 1 (out of 1 available) 18714 1726853409.73424: exiting _queue_task() for managed_node1/set_fact 18714 1726853409.73436: done queuing things up, now waiting for results queue to drain 18714 1726853409.73437: waiting for pending results... 18714 1726853409.73582: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 18714 1726853409.73646: in run() - task 02083763-bbaf-e784-4f7d-000000000116 18714 1726853409.73660: variable 'ansible_search_path' from source: unknown 18714 1726853409.73664: variable 'ansible_search_path' from source: unknown 18714 1726853409.73694: calling self._execute() 18714 1726853409.73751: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.73755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.73761: variable 'omit' from source: magic vars 18714 1726853409.74021: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.74031: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.74036: variable 'omit' from source: magic vars 18714 1726853409.74066: variable 'omit' from source: magic vars 18714 1726853409.74140: variable '_current_interfaces' from source: set_fact 18714 1726853409.74186: variable 'omit' from source: magic vars 18714 1726853409.74219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853409.74246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 18714 1726853409.74263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853409.74277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.74287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.74311: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853409.74314: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.74318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.74385: Set connection var ansible_shell_executable to /bin/sh 18714 1726853409.74391: Set connection var ansible_timeout to 10 18714 1726853409.74396: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853409.74403: Set connection var ansible_connection to ssh 18714 1726853409.74407: Set connection var ansible_shell_type to sh 18714 1726853409.74412: Set connection var ansible_pipelining to False 18714 1726853409.74429: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.74432: variable 'ansible_connection' from source: unknown 18714 1726853409.74435: variable 'ansible_module_compression' from source: unknown 18714 1726853409.74437: variable 'ansible_shell_type' from source: unknown 18714 1726853409.74440: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.74442: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.74444: variable 'ansible_pipelining' from source: unknown 18714 1726853409.74446: variable 'ansible_timeout' from source: unknown 18714 1726853409.74448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.74545: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853409.74553: variable 'omit' from source: magic vars 18714 1726853409.74570: starting attempt loop 18714 1726853409.74576: running the handler 18714 1726853409.74579: handler run complete 18714 1726853409.74585: attempt loop complete, returning result 18714 1726853409.74587: _execute() done 18714 1726853409.74590: dumping result to json 18714 1726853409.74592: done dumping result, returning 18714 1726853409.74599: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-e784-4f7d-000000000116] 18714 1726853409.74602: sending task result for task 02083763-bbaf-e784-4f7d-000000000116 18714 1726853409.74686: done sending task result for task 02083763-bbaf-e784-4f7d-000000000116 18714 1726853409.74689: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}

18714 1726853409.74740: no more pending results, returning what we have 18714 1726853409.74743: results queue empty 18714 1726853409.74744: checking for any_errors_fatal 18714 1726853409.74754: done checking for any_errors_fatal 18714 1726853409.74755: checking for max_fail_percentage 18714 1726853409.74756: done checking for max_fail_percentage 18714 1726853409.74757: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.74757: done checking to see if all hosts have failed 18714 1726853409.74758: getting the remaining hosts for this loop 18714 1726853409.74759: done getting the remaining hosts for this loop 18714 1726853409.74762: getting the next task for host managed_node1 18714 1726853409.74769: done getting next task for host managed_node1 18714 1726853409.74773: ^ task is: 
TASK: Show current_interfaces 18714 1726853409.74775: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853409.74780: getting variables 18714 1726853409.74781: in VariableManager get_vars() 18714 1726853409.74803: Calling all_inventory to load vars for managed_node1 18714 1726853409.74806: Calling groups_inventory to load vars for managed_node1 18714 1726853409.74808: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.74816: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.74818: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.74821: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.74941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.75078: done with get_vars() 18714 1726853409.75085: done getting variables 18714 1726853409.75123: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 
2024 13:30:09 -0400 (0:00:00.019) 0:00:06.134 ****** 18714 1726853409.75142: entering _queue_task() for managed_node1/debug 18714 1726853409.75329: worker is 1 (out of 1 available) 18714 1726853409.75342: exiting _queue_task() for managed_node1/debug 18714 1726853409.75355: done queuing things up, now waiting for results queue to drain 18714 1726853409.75356: waiting for pending results... 18714 1726853409.75492: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 18714 1726853409.75546: in run() - task 02083763-bbaf-e784-4f7d-000000000105 18714 1726853409.75557: variable 'ansible_search_path' from source: unknown 18714 1726853409.75561: variable 'ansible_search_path' from source: unknown 18714 1726853409.75593: calling self._execute() 18714 1726853409.75652: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.75657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.75661: variable 'omit' from source: magic vars 18714 1726853409.75936: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.75944: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.75952: variable 'omit' from source: magic vars 18714 1726853409.75976: variable 'omit' from source: magic vars 18714 1726853409.76043: variable 'current_interfaces' from source: set_fact 18714 1726853409.76063: variable 'omit' from source: magic vars 18714 1726853409.76093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853409.76122: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853409.76135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853409.76147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
18714 1726853409.76158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.76182: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853409.76186: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.76188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.76253: Set connection var ansible_shell_executable to /bin/sh 18714 1726853409.76264: Set connection var ansible_timeout to 10 18714 1726853409.76269: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853409.76277: Set connection var ansible_connection to ssh 18714 1726853409.76282: Set connection var ansible_shell_type to sh 18714 1726853409.76287: Set connection var ansible_pipelining to False 18714 1726853409.76301: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.76305: variable 'ansible_connection' from source: unknown 18714 1726853409.76307: variable 'ansible_module_compression' from source: unknown 18714 1726853409.76309: variable 'ansible_shell_type' from source: unknown 18714 1726853409.76312: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.76314: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.76318: variable 'ansible_pipelining' from source: unknown 18714 1726853409.76320: variable 'ansible_timeout' from source: unknown 18714 1726853409.76324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.76420: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853409.76428: variable 'omit' from 
source: magic vars 18714 1726853409.76433: starting attempt loop 18714 1726853409.76436: running the handler 18714 1726853409.76475: handler run complete 18714 1726853409.76485: attempt loop complete, returning result 18714 1726853409.76487: _execute() done 18714 1726853409.76490: dumping result to json 18714 1726853409.76492: done dumping result, returning 18714 1726853409.76499: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-e784-4f7d-000000000105] 18714 1726853409.76502: sending task result for task 02083763-bbaf-e784-4f7d-000000000105 18714 1726853409.76582: done sending task result for task 02083763-bbaf-e784-4f7d-000000000105 18714 1726853409.76585: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']

18714 1726853409.76628: no more pending results, returning what we have 18714 1726853409.76631: results queue empty 18714 1726853409.76632: checking for any_errors_fatal 18714 1726853409.76639: done checking for any_errors_fatal 18714 1726853409.76640: checking for max_fail_percentage 18714 1726853409.76642: done checking for max_fail_percentage 18714 1726853409.76642: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.76643: done checking to see if all hosts have failed 18714 1726853409.76644: getting the remaining hosts for this loop 18714 1726853409.76645: done getting the remaining hosts for this loop 18714 1726853409.76648: getting the next task for host managed_node1 18714 1726853409.76657: done getting next task for host managed_node1 18714 1726853409.76659: ^ task is: TASK: Include the task 'manage_test_interface.yml' 18714 1726853409.76661: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18714 1726853409.76664: getting variables 18714 1726853409.76665: in VariableManager get_vars() 18714 1726853409.76690: Calling all_inventory to load vars for managed_node1 18714 1726853409.76692: Calling groups_inventory to load vars for managed_node1 18714 1726853409.76695: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.76702: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.76704: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.76707: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.76829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.76940: done with get_vars() 18714 1726853409.76947: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 13:30:09 -0400 (0:00:00.018) 0:00:06.153 ****** 18714 1726853409.77010: entering _queue_task() for managed_node1/include_tasks 18714 1726853409.77236: worker is 1 (out of 1 available) 18714 1726853409.77248: exiting _queue_task() for managed_node1/include_tasks 18714 1726853409.77261: done queuing things up, now waiting for results queue to drain 18714 1726853409.77262: waiting for pending results... 
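The two task results above show the data flow these test helpers rely on: "Set current_interfaces" stores the command's output lines as the `current_interfaces` fact, and "Show current_interfaces" renders that fact, producing the `current_interfaces: [...]` message in the log. A minimal sketch of that flow in plain Python (not the actual set_fact/debug plugin code; Jinja2 effectively renders the list via Python's `str()`/`repr()`, which is why the MSG line looks like a Python list literal):

```python
# stdout of `ls -1` in /sys/class/net, as reported by the command task above.
command_stdout = "bonding_masters\neth0\nlo"

# What "Set current_interfaces" effectively does: keep the split lines as a fact.
current_interfaces = command_stdout.splitlines()

# What "Show current_interfaces" effectively renders: formatting the list
# yields its repr, matching the MSG line in the log.
msg = "current_interfaces: {}".format(current_interfaces)
print(msg)  # -> current_interfaces: ['bonding_masters', 'eth0', 'lo']
```

The `eth0` entry in that fact is what the upcoming `manage_test_interface.yml` include manipulates in later tasks.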
18714 1726853409.77688: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 18714 1726853409.77694: in run() - task 02083763-bbaf-e784-4f7d-000000000011 18714 1726853409.77696: variable 'ansible_search_path' from source: unknown 18714 1726853409.77698: calling self._execute() 18714 1726853409.77721: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.77731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.77745: variable 'omit' from source: magic vars 18714 1726853409.78102: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.78120: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.78130: _execute() done 18714 1726853409.78143: dumping result to json 18714 1726853409.78152: done dumping result, returning 18714 1726853409.78163: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-e784-4f7d-000000000011] 18714 1726853409.78177: sending task result for task 02083763-bbaf-e784-4f7d-000000000011 18714 1726853409.78309: no more pending results, returning what we have 18714 1726853409.78314: in VariableManager get_vars() 18714 1726853409.78346: Calling all_inventory to load vars for managed_node1 18714 1726853409.78349: Calling groups_inventory to load vars for managed_node1 18714 1726853409.78352: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.78364: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.78366: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.78369: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.78557: done sending task result for task 02083763-bbaf-e784-4f7d-000000000011 18714 1726853409.78561: WORKER PROCESS EXITING 18714 1726853409.78572: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.78682: done with get_vars() 18714 1726853409.78687: variable 'ansible_search_path' from source: unknown 18714 1726853409.78695: we have included files to process 18714 1726853409.78696: generating all_blocks data 18714 1726853409.78697: done generating all_blocks data 18714 1726853409.78701: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18714 1726853409.78702: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18714 1726853409.78704: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18714 1726853409.79022: in VariableManager get_vars() 18714 1726853409.79035: done with get_vars() 18714 1726853409.79181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 18714 1726853409.79552: done processing included file 18714 1726853409.79554: iterating over new_blocks loaded from include file 18714 1726853409.79555: in VariableManager get_vars() 18714 1726853409.79562: done with get_vars() 18714 1726853409.79563: filtering new block on tags 18714 1726853409.79584: done filtering new block on tags 18714 1726853409.79586: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 18714 1726853409.79589: extending task lists for all hosts with included blocks 18714 1726853409.79686: done extending task lists 18714 1726853409.79687: done processing included files 18714 1726853409.79688: results queue empty 18714 1726853409.79688: checking for any_errors_fatal 18714 1726853409.79690: done checking for 
any_errors_fatal 18714 1726853409.79691: checking for max_fail_percentage 18714 1726853409.79691: done checking for max_fail_percentage 18714 1726853409.79692: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.79692: done checking to see if all hosts have failed 18714 1726853409.79693: getting the remaining hosts for this loop 18714 1726853409.79693: done getting the remaining hosts for this loop 18714 1726853409.79695: getting the next task for host managed_node1 18714 1726853409.79697: done getting next task for host managed_node1 18714 1726853409.79699: ^ task is: TASK: Ensure state in ["present", "absent"] 18714 1726853409.79700: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.79701: getting variables 18714 1726853409.79702: in VariableManager get_vars() 18714 1726853409.79707: Calling all_inventory to load vars for managed_node1 18714 1726853409.79708: Calling groups_inventory to load vars for managed_node1 18714 1726853409.79710: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.79713: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.79714: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.79716: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.79797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.79903: done with get_vars() 18714 1726853409.79911: done getting variables 18714 1726853409.79956: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:30:09 -0400 (0:00:00.029) 0:00:06.183 ****** 18714 1726853409.79976: entering _queue_task() for managed_node1/fail 18714 1726853409.79977: Creating lock for fail 18714 1726853409.80394: worker is 1 (out of 1 available) 18714 1726853409.80402: exiting _queue_task() for managed_node1/fail 18714 1726853409.80410: done queuing things up, now waiting for results queue to drain 18714 1726853409.80411: waiting for pending results... 
18714 1726853409.80646: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 18714 1726853409.80667: in run() - task 02083763-bbaf-e784-4f7d-000000000131 18714 1726853409.80690: variable 'ansible_search_path' from source: unknown 18714 1726853409.80699: variable 'ansible_search_path' from source: unknown 18714 1726853409.80739: calling self._execute() 18714 1726853409.80818: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.80828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.80857: variable 'omit' from source: magic vars 18714 1726853409.81285: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.81289: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.81395: variable 'state' from source: include params 18714 1726853409.81405: Evaluated conditional (state not in ["present", "absent"]): False 18714 1726853409.81412: when evaluation is False, skipping this task 18714 1726853409.81419: _execute() done 18714 1726853409.81425: dumping result to json 18714 1726853409.81433: done dumping result, returning 18714 1726853409.81443: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-e784-4f7d-000000000131] 18714 1726853409.81454: sending task result for task 02083763-bbaf-e784-4f7d-000000000131 18714 1726853409.81720: done sending task result for task 02083763-bbaf-e784-4f7d-000000000131 18714 1726853409.81723: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 18714 1726853409.81760: no more pending results, returning what we have 18714 1726853409.81764: results queue empty 18714 1726853409.81764: checking for any_errors_fatal 18714 1726853409.81766: done checking for any_errors_fatal 18714 1726853409.81766: 
checking for max_fail_percentage 18714 1726853409.81768: done checking for max_fail_percentage 18714 1726853409.81769: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.81769: done checking to see if all hosts have failed 18714 1726853409.81770: getting the remaining hosts for this loop 18714 1726853409.81773: done getting the remaining hosts for this loop 18714 1726853409.81776: getting the next task for host managed_node1 18714 1726853409.81780: done getting next task for host managed_node1 18714 1726853409.81782: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 18714 1726853409.81785: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.81788: getting variables 18714 1726853409.81789: in VariableManager get_vars() 18714 1726853409.81814: Calling all_inventory to load vars for managed_node1 18714 1726853409.81817: Calling groups_inventory to load vars for managed_node1 18714 1726853409.81820: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.81830: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.81832: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.81835: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.82091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.82285: done with get_vars() 18714 1726853409.82294: done getting variables 18714 1726853409.82346: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:30:09 -0400 (0:00:00.023) 0:00:06.207 ****** 18714 1726853409.82378: entering _queue_task() for managed_node1/fail 18714 1726853409.82620: worker is 1 (out of 1 available) 18714 1726853409.82633: exiting _queue_task() for managed_node1/fail 18714 1726853409.82643: done queuing things up, now waiting for results queue to drain 18714 1726853409.82644: waiting for pending results... 
18714 1726853409.82886: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 18714 1726853409.82982: in run() - task 02083763-bbaf-e784-4f7d-000000000132 18714 1726853409.83005: variable 'ansible_search_path' from source: unknown 18714 1726853409.83012: variable 'ansible_search_path' from source: unknown 18714 1726853409.83053: calling self._execute() 18714 1726853409.83137: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.83148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.83164: variable 'omit' from source: magic vars 18714 1726853409.83519: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.83537: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.83876: variable 'type' from source: set_fact 18714 1726853409.83879: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 18714 1726853409.83882: when evaluation is False, skipping this task 18714 1726853409.83884: _execute() done 18714 1726853409.83886: dumping result to json 18714 1726853409.83888: done dumping result, returning 18714 1726853409.83890: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-e784-4f7d-000000000132] 18714 1726853409.83892: sending task result for task 02083763-bbaf-e784-4f7d-000000000132 18714 1726853409.83956: done sending task result for task 02083763-bbaf-e784-4f7d-000000000132 18714 1726853409.83959: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 18714 1726853409.84006: no more pending results, returning what we have 18714 1726853409.84009: results queue empty 18714 1726853409.84010: checking for any_errors_fatal 18714 1726853409.84016: done checking for any_errors_fatal 18714 1726853409.84017: 
checking for max_fail_percentage 18714 1726853409.84018: done checking for max_fail_percentage 18714 1726853409.84019: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.84020: done checking to see if all hosts have failed 18714 1726853409.84021: getting the remaining hosts for this loop 18714 1726853409.84022: done getting the remaining hosts for this loop 18714 1726853409.84025: getting the next task for host managed_node1 18714 1726853409.84031: done getting next task for host managed_node1 18714 1726853409.84034: ^ task is: TASK: Include the task 'show_interfaces.yml' 18714 1726853409.84037: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.84040: getting variables 18714 1726853409.84042: in VariableManager get_vars() 18714 1726853409.84076: Calling all_inventory to load vars for managed_node1 18714 1726853409.84079: Calling groups_inventory to load vars for managed_node1 18714 1726853409.84083: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.84095: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.84098: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.84102: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.84379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.84599: done with get_vars() 18714 1726853409.84608: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:30:09 -0400 (0:00:00.023) 0:00:06.230 ****** 18714 1726853409.84698: entering _queue_task() for managed_node1/include_tasks 18714 1726853409.84953: worker is 1 (out of 1 available) 18714 1726853409.84965: exiting _queue_task() for managed_node1/include_tasks 18714 1726853409.84978: done queuing things up, now waiting for results queue to drain 18714 1726853409.84979: waiting for pending results... 
18714 1726853409.85216: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18714 1726853409.85317: in run() - task 02083763-bbaf-e784-4f7d-000000000133 18714 1726853409.85376: variable 'ansible_search_path' from source: unknown 18714 1726853409.85380: variable 'ansible_search_path' from source: unknown 18714 1726853409.85382: calling self._execute() 18714 1726853409.85465: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.85478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.85492: variable 'omit' from source: magic vars 18714 1726853409.85838: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.85857: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.85885: _execute() done 18714 1726853409.85888: dumping result to json 18714 1726853409.85890: done dumping result, returning 18714 1726853409.85893: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-e784-4f7d-000000000133] 18714 1726853409.85897: sending task result for task 02083763-bbaf-e784-4f7d-000000000133 18714 1726853409.86014: no more pending results, returning what we have 18714 1726853409.86019: in VariableManager get_vars() 18714 1726853409.86055: Calling all_inventory to load vars for managed_node1 18714 1726853409.86058: Calling groups_inventory to load vars for managed_node1 18714 1726853409.86062: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.86077: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.86080: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.86083: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.86376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.86685: done with 
get_vars() 18714 1726853409.86693: variable 'ansible_search_path' from source: unknown 18714 1726853409.86694: variable 'ansible_search_path' from source: unknown 18714 1726853409.86707: done sending task result for task 02083763-bbaf-e784-4f7d-000000000133 18714 1726853409.86710: WORKER PROCESS EXITING 18714 1726853409.86737: we have included files to process 18714 1726853409.86738: generating all_blocks data 18714 1726853409.86739: done generating all_blocks data 18714 1726853409.86744: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18714 1726853409.86745: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18714 1726853409.86747: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18714 1726853409.86847: in VariableManager get_vars() 18714 1726853409.86867: done with get_vars() 18714 1726853409.86974: done processing included file 18714 1726853409.86976: iterating over new_blocks loaded from include file 18714 1726853409.86978: in VariableManager get_vars() 18714 1726853409.86989: done with get_vars() 18714 1726853409.86991: filtering new block on tags 18714 1726853409.87007: done filtering new block on tags 18714 1726853409.87009: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 18714 1726853409.87013: extending task lists for all hosts with included blocks 18714 1726853409.87407: done extending task lists 18714 1726853409.87408: done processing included files 18714 1726853409.87409: results queue empty 18714 1726853409.87410: checking for any_errors_fatal 18714 1726853409.87413: done checking for any_errors_fatal 18714 1726853409.87413: checking for 
max_fail_percentage 18714 1726853409.87414: done checking for max_fail_percentage 18714 1726853409.87415: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.87416: done checking to see if all hosts have failed 18714 1726853409.87417: getting the remaining hosts for this loop 18714 1726853409.87418: done getting the remaining hosts for this loop 18714 1726853409.87420: getting the next task for host managed_node1 18714 1726853409.87424: done getting next task for host managed_node1 18714 1726853409.87427: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18714 1726853409.87430: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.87432: getting variables 18714 1726853409.87433: in VariableManager get_vars() 18714 1726853409.87441: Calling all_inventory to load vars for managed_node1 18714 1726853409.87443: Calling groups_inventory to load vars for managed_node1 18714 1726853409.87445: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.87453: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.87455: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.87459: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.87622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.87812: done with get_vars() 18714 1726853409.87821: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:30:09 -0400 (0:00:00.031) 0:00:06.262 ****** 18714 1726853409.87894: entering _queue_task() for managed_node1/include_tasks 18714 1726853409.88147: worker is 1 (out of 1 available) 18714 1726853409.88161: exiting _queue_task() for managed_node1/include_tasks 18714 1726853409.88376: done queuing things up, now waiting for results queue to drain 18714 1726853409.88378: waiting for pending results... 
18714 1726853409.88419: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 18714 1726853409.88521: in run() - task 02083763-bbaf-e784-4f7d-00000000015c 18714 1726853409.88541: variable 'ansible_search_path' from source: unknown 18714 1726853409.88547: variable 'ansible_search_path' from source: unknown 18714 1726853409.88588: calling self._execute() 18714 1726853409.88668: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.88682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.88695: variable 'omit' from source: magic vars 18714 1726853409.89045: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.89064: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.89076: _execute() done 18714 1726853409.89084: dumping result to json 18714 1726853409.89090: done dumping result, returning 18714 1726853409.89099: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-e784-4f7d-00000000015c] 18714 1726853409.89107: sending task result for task 02083763-bbaf-e784-4f7d-00000000015c 18714 1726853409.89276: done sending task result for task 02083763-bbaf-e784-4f7d-00000000015c 18714 1726853409.89280: WORKER PROCESS EXITING 18714 1726853409.89307: no more pending results, returning what we have 18714 1726853409.89312: in VariableManager get_vars() 18714 1726853409.89345: Calling all_inventory to load vars for managed_node1 18714 1726853409.89348: Calling groups_inventory to load vars for managed_node1 18714 1726853409.89354: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.89367: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.89372: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.89375: Calling groups_plugins_play to load vars for managed_node1 18714 
1726853409.89696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.89913: done with get_vars() 18714 1726853409.89920: variable 'ansible_search_path' from source: unknown 18714 1726853409.89921: variable 'ansible_search_path' from source: unknown 18714 1726853409.89980: we have included files to process 18714 1726853409.89981: generating all_blocks data 18714 1726853409.89983: done generating all_blocks data 18714 1726853409.89984: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18714 1726853409.89985: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18714 1726853409.89987: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18714 1726853409.90240: done processing included file 18714 1726853409.90242: iterating over new_blocks loaded from include file 18714 1726853409.90244: in VariableManager get_vars() 18714 1726853409.90259: done with get_vars() 18714 1726853409.90260: filtering new block on tags 18714 1726853409.90279: done filtering new block on tags 18714 1726853409.90281: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 18714 1726853409.90285: extending task lists for all hosts with included blocks 18714 1726853409.90425: done extending task lists 18714 1726853409.90426: done processing included files 18714 1726853409.90427: results queue empty 18714 1726853409.90428: checking for any_errors_fatal 18714 1726853409.90430: done checking for any_errors_fatal 18714 1726853409.90431: checking for max_fail_percentage 18714 1726853409.90432: done 
checking for max_fail_percentage 18714 1726853409.90433: checking to see if all hosts have failed and the running result is not ok 18714 1726853409.90433: done checking to see if all hosts have failed 18714 1726853409.90434: getting the remaining hosts for this loop 18714 1726853409.90435: done getting the remaining hosts for this loop 18714 1726853409.90438: getting the next task for host managed_node1 18714 1726853409.90442: done getting next task for host managed_node1 18714 1726853409.90444: ^ task is: TASK: Gather current interface info 18714 1726853409.90447: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853409.90452: getting variables 18714 1726853409.90453: in VariableManager get_vars() 18714 1726853409.90462: Calling all_inventory to load vars for managed_node1 18714 1726853409.90464: Calling groups_inventory to load vars for managed_node1 18714 1726853409.90466: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853409.90474: Calling all_plugins_play to load vars for managed_node1 18714 1726853409.90476: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853409.90480: Calling groups_plugins_play to load vars for managed_node1 18714 1726853409.90617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853409.90809: done with get_vars() 18714 1726853409.90817: done getting variables 18714 1726853409.90855: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:30:09 -0400 (0:00:00.029) 0:00:06.292 ****** 18714 1726853409.90887: entering _queue_task() for managed_node1/command 18714 1726853409.91145: worker is 1 (out of 1 available) 18714 1726853409.91160: exiting _queue_task() for managed_node1/command 18714 1726853409.91376: done queuing things up, now waiting for results queue to drain 18714 1726853409.91377: waiting for pending results... 
18714 1726853409.91504: running TaskExecutor() for managed_node1/TASK: Gather current interface info 18714 1726853409.91530: in run() - task 02083763-bbaf-e784-4f7d-000000000193 18714 1726853409.91551: variable 'ansible_search_path' from source: unknown 18714 1726853409.91559: variable 'ansible_search_path' from source: unknown 18714 1726853409.91603: calling self._execute() 18714 1726853409.91684: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.91694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.91711: variable 'omit' from source: magic vars 18714 1726853409.92125: variable 'ansible_distribution_major_version' from source: facts 18714 1726853409.92145: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853409.92158: variable 'omit' from source: magic vars 18714 1726853409.92214: variable 'omit' from source: magic vars 18714 1726853409.92259: variable 'omit' from source: magic vars 18714 1726853409.92303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853409.92341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853409.92468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853409.92474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.92476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853409.92478: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853409.92480: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.92482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 
1726853409.92554: Set connection var ansible_shell_executable to /bin/sh 18714 1726853409.92565: Set connection var ansible_timeout to 10 18714 1726853409.92579: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853409.92591: Set connection var ansible_connection to ssh 18714 1726853409.92600: Set connection var ansible_shell_type to sh 18714 1726853409.92608: Set connection var ansible_pipelining to False 18714 1726853409.92632: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.92639: variable 'ansible_connection' from source: unknown 18714 1726853409.92646: variable 'ansible_module_compression' from source: unknown 18714 1726853409.92655: variable 'ansible_shell_type' from source: unknown 18714 1726853409.92661: variable 'ansible_shell_executable' from source: unknown 18714 1726853409.92667: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853409.92676: variable 'ansible_pipelining' from source: unknown 18714 1726853409.92686: variable 'ansible_timeout' from source: unknown 18714 1726853409.92693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853409.92835: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853409.92852: variable 'omit' from source: magic vars 18714 1726853409.92862: starting attempt loop 18714 1726853409.92868: running the handler 18714 1726853409.92899: _low_level_execute_command(): starting 18714 1726853409.92902: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853409.93636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853409.93673: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853409.93772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853409.93787: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853409.93804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853409.93824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.93910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.95598: stdout chunk (state=3): >>>/root <<< 18714 1726853409.95742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.95755: stdout chunk (state=3): >>><<< 18714 1726853409.95773: stderr chunk (state=3): >>><<< 18714 1726853409.95797: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853409.95816: _low_level_execute_command(): starting 18714 1726853409.95827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058 `" && echo ansible-tmp-1726853409.958039-19014-42771032457058="` echo /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058 `" ) && sleep 0' 18714 1726853409.96472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853409.96487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.96502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853409.96519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853409.96624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853409.96685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853409.96734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853409.98648: stdout chunk (state=3): >>>ansible-tmp-1726853409.958039-19014-42771032457058=/root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058 <<< 18714 1726853409.98807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853409.98811: stdout chunk (state=3): >>><<< 18714 1726853409.98814: stderr chunk (state=3): >>><<< 18714 1726853409.98976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853409.958039-19014-42771032457058=/root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853409.98979: variable 'ansible_module_compression' from source: unknown 18714 1726853409.98982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853409.98984: variable 'ansible_facts' from source: unknown 18714 1726853409.99068: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/AnsiballZ_command.py 18714 1726853409.99228: Sending initial data 18714 1726853409.99238: Sent initial data (154 bytes) 18714 1726853409.99877: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853409.99902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853409.99921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853410.00011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.00066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853410.00081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.00154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.01689: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853410.01720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853410.01762: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp_ljta0zr /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/AnsiballZ_command.py <<< 18714 1726853410.01764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/AnsiballZ_command.py" <<< 18714 1726853410.01803: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp_ljta0zr" to remote "/root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/AnsiballZ_command.py" <<< 18714 1726853410.01806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/AnsiballZ_command.py" <<< 18714 1726853410.02743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853410.02751: stdout chunk (state=3): >>><<< 18714 1726853410.02754: stderr chunk (state=3): >>><<< 18714 1726853410.02756: done transferring module to remote 18714 1726853410.02758: _low_level_execute_command(): starting 18714 1726853410.02760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/ /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/AnsiballZ_command.py && sleep 0' 18714 1726853410.03238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853410.03246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853410.03281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853410.03285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853410.03287: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.03289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853410.03297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.03338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853410.03354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.03390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.05389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853410.05393: stdout chunk (state=3): >>><<< 18714 1726853410.05395: stderr chunk (state=3): >>><<< 18714 1726853410.05398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853410.05400: _low_level_execute_command(): starting 18714 1726853410.05403: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/AnsiballZ_command.py && sleep 0' 18714 1726853410.06374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.06441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853410.06456: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853410.06509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.06579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.22077: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:30:10.216600", "end": "2024-09-20 13:30:10.219999", "delta": "0:00:00.003399", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853410.23758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853410.23762: stdout chunk (state=3): >>><<< 18714 1726853410.23765: stderr chunk (state=3): >>><<< 18714 1726853410.23767: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:30:10.216600", "end": "2024-09-20 13:30:10.219999", "delta": "0:00:00.003399", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853410.23769: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853410.23774: _low_level_execute_command(): starting 18714 1726853410.23776: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853409.958039-19014-42771032457058/ > /dev/null 2>&1 && sleep 0' 18714 1726853410.24458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.24473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.24524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853410.24542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.24586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.26384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853410.26439: stderr chunk (state=3): >>><<< 18714 1726853410.26443: stdout chunk (state=3): >>><<< 18714 1726853410.26446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853410.26676: handler run complete 18714 1726853410.26679: Evaluated conditional (False): False 18714 1726853410.26681: attempt loop complete, returning result 18714 1726853410.26683: _execute() done 18714 1726853410.26685: dumping result to json 18714 1726853410.26687: done dumping result, returning 18714 1726853410.26688: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-e784-4f7d-000000000193] 18714 1726853410.26690: sending task result for task 02083763-bbaf-e784-4f7d-000000000193 18714 1726853410.26754: done sending task result for task 02083763-bbaf-e784-4f7d-000000000193 18714 1726853410.26757: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003399", "end": "2024-09-20 13:30:10.219999", "rc": 0, "start": "2024-09-20 13:30:10.216600" } STDOUT: bonding_masters eth0 lo 18714 1726853410.26824: no more pending results, returning what we have 18714 1726853410.26828: results queue empty 18714 1726853410.26829: checking for any_errors_fatal 18714 1726853410.26830: done checking for any_errors_fatal 18714 1726853410.26830: checking for max_fail_percentage 
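The raw payload echoed in the stdout chunks above is the JSON the command module prints on the remote host; the controller parses it and derives `stdout_lines` by splitting the `stdout` string. A minimal sketch of that parsing step, using an abridged copy of the exact payload from the log:

```python
import json

# Abridged copy of the module result shown in the log above.
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
       '"stderr": "", "rc": 0, "cmd": ["ls", "-1"]}')

result = json.loads(raw)
# stdout arrives as a single string; stdout_lines is its line-split form.
stdout_lines = result["stdout"].splitlines()
print(stdout_lines)  # ['bonding_masters', 'eth0', 'lo']
```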
18714 1726853410.26832: done checking for max_fail_percentage 18714 1726853410.26833: checking to see if all hosts have failed and the running result is not ok 18714 1726853410.26833: done checking to see if all hosts have failed 18714 1726853410.26834: getting the remaining hosts for this loop 18714 1726853410.26835: done getting the remaining hosts for this loop 18714 1726853410.26838: getting the next task for host managed_node1 18714 1726853410.26844: done getting next task for host managed_node1 18714 1726853410.26846: ^ task is: TASK: Set current_interfaces 18714 1726853410.26850: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853410.26853: getting variables 18714 1726853410.26854: in VariableManager get_vars() 18714 1726853410.26927: Calling all_inventory to load vars for managed_node1 18714 1726853410.26930: Calling groups_inventory to load vars for managed_node1 18714 1726853410.26937: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853410.26948: Calling all_plugins_play to load vars for managed_node1 18714 1726853410.26951: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853410.26954: Calling groups_plugins_play to load vars for managed_node1 18714 1726853410.27112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853410.27299: done with get_vars() 18714 1726853410.27309: done getting variables 18714 1726853410.27362: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:30:10 -0400 (0:00:00.365) 0:00:06.657 ****** 18714 1726853410.27399: entering _queue_task() for managed_node1/set_fact 18714 1726853410.27662: worker is 1 (out of 1 available) 18714 1726853410.27676: exiting _queue_task() for managed_node1/set_fact 18714 1726853410.27687: done queuing things up, now waiting for results queue to drain 18714 1726853410.27687: waiting for pending results... 
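The Set current_interfaces task being queued here is a `set_fact` that, per its result printed further below, turns the registered `ls -1` output into the `current_interfaces` list fact. A sketch of the equivalent transformation (variable names follow the log; this is not Ansible's internal code):

```python
# stdout registered by the "Gather current interface info" task above.
_current_interfaces = "bonding_masters\neth0\nlo"

# set_fact publishes new facts as a dict under ansible_facts.
ansible_facts = {"current_interfaces": _current_interfaces.splitlines()}
print(ansible_facts["current_interfaces"])  # ['bonding_masters', 'eth0', 'lo']
```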
18714 1726853410.27930: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 18714 1726853410.28026: in run() - task 02083763-bbaf-e784-4f7d-000000000194 18714 1726853410.28040: variable 'ansible_search_path' from source: unknown 18714 1726853410.28044: variable 'ansible_search_path' from source: unknown 18714 1726853410.28086: calling self._execute() 18714 1726853410.28202: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.28205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.28209: variable 'omit' from source: magic vars 18714 1726853410.28576: variable 'ansible_distribution_major_version' from source: facts 18714 1726853410.28580: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853410.28584: variable 'omit' from source: magic vars 18714 1726853410.28616: variable 'omit' from source: magic vars 18714 1726853410.28724: variable '_current_interfaces' from source: set_fact 18714 1726853410.28793: variable 'omit' from source: magic vars 18714 1726853410.28836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853410.28879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853410.28903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853410.28930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853410.28976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853410.28989: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853410.28998: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.29006: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.29119: Set connection var ansible_shell_executable to /bin/sh 18714 1726853410.29139: Set connection var ansible_timeout to 10 18714 1726853410.29176: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853410.29179: Set connection var ansible_connection to ssh 18714 1726853410.29182: Set connection var ansible_shell_type to sh 18714 1726853410.29184: Set connection var ansible_pipelining to False 18714 1726853410.29208: variable 'ansible_shell_executable' from source: unknown 18714 1726853410.29211: variable 'ansible_connection' from source: unknown 18714 1726853410.29214: variable 'ansible_module_compression' from source: unknown 18714 1726853410.29216: variable 'ansible_shell_type' from source: unknown 18714 1726853410.29263: variable 'ansible_shell_executable' from source: unknown 18714 1726853410.29266: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.29268: variable 'ansible_pipelining' from source: unknown 18714 1726853410.29270: variable 'ansible_timeout' from source: unknown 18714 1726853410.29275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.29344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853410.29355: variable 'omit' from source: magic vars 18714 1726853410.29364: starting attempt loop 18714 1726853410.29367: running the handler 18714 1726853410.29378: handler run complete 18714 1726853410.29385: attempt loop complete, returning result 18714 1726853410.29388: _execute() done 18714 1726853410.29390: dumping result to json 18714 1726853410.29393: done dumping result, returning 18714 
1726853410.29407: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-e784-4f7d-000000000194] 18714 1726853410.29410: sending task result for task 02083763-bbaf-e784-4f7d-000000000194 18714 1726853410.29485: done sending task result for task 02083763-bbaf-e784-4f7d-000000000194 18714 1726853410.29488: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
18714 1726853410.29542: no more pending results, returning what we have 18714 1726853410.29545: results queue empty 18714 1726853410.29546: checking for any_errors_fatal 18714 1726853410.29551: done checking for any_errors_fatal 18714 1726853410.29552: checking for max_fail_percentage 18714 1726853410.29554: done checking for max_fail_percentage 18714 1726853410.29555: checking to see if all hosts have failed and the running result is not ok 18714 1726853410.29555: done checking to see if all hosts have failed 18714 1726853410.29556: getting the remaining hosts for this loop 18714 1726853410.29557: done getting the remaining hosts for this loop 18714 1726853410.29560: getting the next task for host managed_node1 18714 1726853410.29568: done getting next task for host managed_node1 18714 1726853410.29570: ^ task is: TASK: Show current_interfaces 18714 1726853410.29575: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853410.29578: getting variables 18714 1726853410.29580: in VariableManager get_vars() 18714 1726853410.29607: Calling all_inventory to load vars for managed_node1 18714 1726853410.29610: Calling groups_inventory to load vars for managed_node1 18714 1726853410.29613: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853410.29621: Calling all_plugins_play to load vars for managed_node1 18714 1726853410.29623: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853410.29625: Calling groups_plugins_play to load vars for managed_node1 18714 1726853410.29744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853410.29875: done with get_vars() 18714 1726853410.29882: done getting variables 18714 1726853410.29921: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Friday 20 September 2024  13:30:10 -0400 (0:00:00.025)       0:00:06.682 ******
18714 1726853410.29943: entering _queue_task() for managed_node1/debug 18714 1726853410.30126: worker is 1 (out of 1 available) 18714 1726853410.30138: exiting _queue_task() for managed_node1/debug 18714 1726853410.30148: done queuing things up, now waiting for results queue to drain 18714 1726853410.30149: waiting for pending results... 
18714 1726853410.30291: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 18714 1726853410.30346: in run() - task 02083763-bbaf-e784-4f7d-00000000015d 18714 1726853410.30359: variable 'ansible_search_path' from source: unknown 18714 1726853410.30363: variable 'ansible_search_path' from source: unknown 18714 1726853410.30393: calling self._execute() 18714 1726853410.30445: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.30449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.30463: variable 'omit' from source: magic vars 18714 1726853410.30711: variable 'ansible_distribution_major_version' from source: facts 18714 1726853410.30720: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853410.30725: variable 'omit' from source: magic vars 18714 1726853410.30755: variable 'omit' from source: magic vars 18714 1726853410.30821: variable 'current_interfaces' from source: set_fact 18714 1726853410.30840: variable 'omit' from source: magic vars 18714 1726853410.30873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853410.30897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853410.30912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853410.30928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853410.30937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853410.30962: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853410.30965: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.30968: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.31035: Set connection var ansible_shell_executable to /bin/sh 18714 1726853410.31041: Set connection var ansible_timeout to 10 18714 1726853410.31046: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853410.31055: Set connection var ansible_connection to ssh 18714 1726853410.31060: Set connection var ansible_shell_type to sh 18714 1726853410.31064: Set connection var ansible_pipelining to False 18714 1726853410.31276: variable 'ansible_shell_executable' from source: unknown 18714 1726853410.31279: variable 'ansible_connection' from source: unknown 18714 1726853410.31282: variable 'ansible_module_compression' from source: unknown 18714 1726853410.31284: variable 'ansible_shell_type' from source: unknown 18714 1726853410.31285: variable 'ansible_shell_executable' from source: unknown 18714 1726853410.31287: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.31289: variable 'ansible_pipelining' from source: unknown 18714 1726853410.31291: variable 'ansible_timeout' from source: unknown 18714 1726853410.31293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.31295: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853410.31297: variable 'omit' from source: magic vars 18714 1726853410.31299: starting attempt loop 18714 1726853410.31303: running the handler 18714 1726853410.31336: handler run complete 18714 1726853410.31360: attempt loop complete, returning result 18714 1726853410.31368: _execute() done 18714 1726853410.31378: dumping result to json 18714 1726853410.31386: done dumping result, returning 18714 1726853410.31398: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-e784-4f7d-00000000015d] 18714 1726853410.31406: sending task result for task 02083763-bbaf-e784-4f7d-00000000015d
ok: [managed_node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']
18714 1726853410.31539: no more pending results, returning what we have 18714 1726853410.31542: results queue empty 18714 1726853410.31543: checking for any_errors_fatal 18714 1726853410.31553: done checking for any_errors_fatal 18714 1726853410.31553: checking for max_fail_percentage 18714 1726853410.31555: done checking for max_fail_percentage 18714 1726853410.31556: checking to see if all hosts have failed and the running result is not ok 18714 1726853410.31556: done checking to see if all hosts have failed 18714 1726853410.31557: getting the remaining hosts for this loop 18714 1726853410.31558: done getting the remaining hosts for this loop 18714 1726853410.31563: getting the next task for host managed_node1 18714 1726853410.31573: done getting next task for host managed_node1 18714 1726853410.31576: ^ task is: TASK: Install iproute 18714 1726853410.31579: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853410.31584: getting variables 18714 1726853410.31585: in VariableManager get_vars() 18714 1726853410.31614: Calling all_inventory to load vars for managed_node1 18714 1726853410.31616: Calling groups_inventory to load vars for managed_node1 18714 1726853410.31620: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853410.31630: Calling all_plugins_play to load vars for managed_node1 18714 1726853410.31632: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853410.31635: Calling groups_plugins_play to load vars for managed_node1 18714 1726853410.32027: done sending task result for task 02083763-bbaf-e784-4f7d-00000000015d 18714 1726853410.32031: WORKER PROCESS EXITING 18714 1726853410.32057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853410.32227: done with get_vars() 18714 1726853410.32234: done getting variables 18714 1726853410.32276: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [Install iproute] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Friday 20 September 2024  13:30:10 -0400 (0:00:00.023)       0:00:06.706 ******
18714 1726853410.32297: entering _queue_task() for managed_node1/package 18714 1726853410.32478: worker is 1 (out of 1 available) 18714 1726853410.32491: exiting _queue_task() for managed_node1/package 18714 1726853410.32503: done queuing things up, now waiting for results queue to drain 18714 1726853410.32504: waiting for pending results... 
18714 1726853410.32657: running TaskExecutor() for managed_node1/TASK: Install iproute 18714 1726853410.32718: in run() - task 02083763-bbaf-e784-4f7d-000000000134 18714 1726853410.32729: variable 'ansible_search_path' from source: unknown 18714 1726853410.32734: variable 'ansible_search_path' from source: unknown 18714 1726853410.32765: calling self._execute() 18714 1726853410.32825: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.32828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.32839: variable 'omit' from source: magic vars 18714 1726853410.33099: variable 'ansible_distribution_major_version' from source: facts 18714 1726853410.33108: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853410.33113: variable 'omit' from source: magic vars 18714 1726853410.33138: variable 'omit' from source: magic vars 18714 1726853410.33263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853410.34835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853410.34878: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853410.34910: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853410.34931: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853410.34960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853410.35029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853410.35049: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853410.35068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853410.35097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853410.35108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853410.35177: variable '__network_is_ostree' from source: set_fact 18714 1726853410.35180: variable 'omit' from source: magic vars 18714 1726853410.35202: variable 'omit' from source: magic vars 18714 1726853410.35221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853410.35244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853410.35258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853410.35273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853410.35281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853410.35304: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853410.35307: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.35309: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18714 1726853410.35377: Set connection var ansible_shell_executable to /bin/sh 18714 1726853410.35383: Set connection var ansible_timeout to 10 18714 1726853410.35388: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853410.35395: Set connection var ansible_connection to ssh 18714 1726853410.35400: Set connection var ansible_shell_type to sh 18714 1726853410.35404: Set connection var ansible_pipelining to False 18714 1726853410.35420: variable 'ansible_shell_executable' from source: unknown 18714 1726853410.35422: variable 'ansible_connection' from source: unknown 18714 1726853410.35425: variable 'ansible_module_compression' from source: unknown 18714 1726853410.35428: variable 'ansible_shell_type' from source: unknown 18714 1726853410.35430: variable 'ansible_shell_executable' from source: unknown 18714 1726853410.35432: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853410.35437: variable 'ansible_pipelining' from source: unknown 18714 1726853410.35439: variable 'ansible_timeout' from source: unknown 18714 1726853410.35453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853410.35515: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853410.35522: variable 'omit' from source: magic vars 18714 1726853410.35527: starting attempt loop 18714 1726853410.35530: running the handler 18714 1726853410.35535: variable 'ansible_facts' from source: unknown 18714 1726853410.35537: variable 'ansible_facts' from source: unknown 18714 1726853410.35565: _low_level_execute_command(): starting 18714 1726853410.35573: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 
1726853410.36061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853410.36065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.36068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853410.36070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853410.36075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.36117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853410.36132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.36170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.37793: stdout chunk (state=3): >>>/root <<< 18714 1726853410.37897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853410.37924: stderr chunk (state=3): >>><<< 18714 1726853410.37927: stdout chunk (state=3): >>><<< 18714 1726853410.37950: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853410.37962: _low_level_execute_command(): starting 18714 1726853410.37968: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847 `" && echo ansible-tmp-1726853410.3795094-19047-101447887506847="` echo /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847 `" ) && sleep 0' 18714 1726853410.38401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853410.38404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.38406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 18714 1726853410.38409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853410.38411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.38464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853410.38467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853410.38468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.38503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.40363: stdout chunk (state=3): >>>ansible-tmp-1726853410.3795094-19047-101447887506847=/root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847 <<< 18714 1726853410.40476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853410.40499: stderr chunk (state=3): >>><<< 18714 1726853410.40502: stdout chunk (state=3): >>><<< 18714 1726853410.40516: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853410.3795094-19047-101447887506847=/root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853410.40540: variable 'ansible_module_compression' from source: unknown 18714 1726853410.40587: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 18714 1726853410.40591: ANSIBALLZ: Acquiring lock 18714 1726853410.40594: ANSIBALLZ: Lock acquired: 139791971422656 18714 1726853410.40596: ANSIBALLZ: Creating module 18714 1726853410.50306: ANSIBALLZ: Writing module into payload 18714 1726853410.50440: ANSIBALLZ: Writing module 18714 1726853410.50463: ANSIBALLZ: Renaming module 18714 1726853410.50475: ANSIBALLZ: Done creating module 18714 1726853410.50491: variable 'ansible_facts' from source: unknown 18714 1726853410.50544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/AnsiballZ_dnf.py 18714 1726853410.50645: Sending initial data 18714 1726853410.50649: Sent initial data (152 bytes) 18714 1726853410.51109: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 
1726853410.51113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853410.51115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853410.51117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853410.51119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.51176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853410.51179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853410.51185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.51230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.52875: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18714 1726853410.52879: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853410.52909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853410.52948: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpz5ne8pl5 /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/AnsiballZ_dnf.py <<< 18714 1726853410.52960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/AnsiballZ_dnf.py" <<< 18714 1726853410.52988: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpz5ne8pl5" to remote "/root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/AnsiballZ_dnf.py" <<< 18714 1726853410.52995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/AnsiballZ_dnf.py" <<< 18714 1726853410.53639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853410.53683: stderr chunk (state=3): >>><<< 18714 1726853410.53686: stdout chunk (state=3): >>><<< 18714 1726853410.53726: done transferring module to remote 18714 1726853410.53735: _low_level_execute_command(): starting 18714 1726853410.53740: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/ /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/AnsiballZ_dnf.py && sleep 0' 18714 1726853410.54181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 
1726853410.54184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.54187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853410.54189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.54236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853410.54239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.54290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.56039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853410.56064: stderr chunk (state=3): >>><<< 18714 1726853410.56067: stdout chunk (state=3): >>><<< 18714 1726853410.56082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853410.56085: _low_level_execute_command(): starting 18714 1726853410.56090: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/AnsiballZ_dnf.py && sleep 0' 18714 1726853410.56518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853410.56522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853410.56524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.56526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853410.56528: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853410.56577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853410.56587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853410.56633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853410.97911: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 18714 1726853411.01885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853411.01890: stdout chunk (state=3): >>><<< 18714 1726853411.01892: stderr chunk (state=3): >>><<< 18714 1726853411.02118: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853411.02127: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853411.02130: _low_level_execute_command(): starting 18714 1726853411.02132: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853410.3795094-19047-101447887506847/ > /dev/null 2>&1 && sleep 0' 18714 1726853411.03095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.03379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.03402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.03420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.03493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.05389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.05402: stdout chunk (state=3): >>><<< 18714 1726853411.05415: stderr chunk (state=3): >>><<< 18714 1726853411.05440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.05777: handler run complete 18714 1726853411.05780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853411.06127: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853411.06413: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853411.06576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853411.06579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853411.06582: variable '__install_status' from source: unknown 18714 1726853411.06584: Evaluated conditional (__install_status is success): True 18714 1726853411.06586: attempt loop complete, returning result 18714 1726853411.06595: _execute() done 18714 1726853411.06603: dumping result to json 18714 1726853411.06613: done dumping result, returning 18714 1726853411.06625: done running TaskExecutor() for managed_node1/TASK: Install iproute [02083763-bbaf-e784-4f7d-000000000134] 18714 1726853411.06707: sending task result for task 02083763-bbaf-e784-4f7d-000000000134 ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 18714 1726853411.06904: no more pending results, returning what we have 18714 1726853411.06908: results queue empty 18714 1726853411.06909: checking for any_errors_fatal 18714 1726853411.06913: done checking for any_errors_fatal 18714 1726853411.06914: checking for max_fail_percentage 18714 1726853411.06916: done checking for max_fail_percentage 18714 1726853411.06917: checking to see if all hosts have failed and the running result is not ok 18714 1726853411.06918: done checking to see if all hosts have failed 18714 1726853411.06919: 
getting the remaining hosts for this loop 18714 1726853411.06920: done getting the remaining hosts for this loop 18714 1726853411.06924: getting the next task for host managed_node1 18714 1726853411.06932: done getting next task for host managed_node1 18714 1726853411.06935: ^ task is: TASK: Create veth interface {{ interface }} 18714 1726853411.06937: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853411.06941: getting variables 18714 1726853411.06943: in VariableManager get_vars() 18714 1726853411.07176: Calling all_inventory to load vars for managed_node1 18714 1726853411.07179: Calling groups_inventory to load vars for managed_node1 18714 1726853411.07183: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853411.07196: Calling all_plugins_play to load vars for managed_node1 18714 1726853411.07199: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853411.07202: Calling groups_plugins_play to load vars for managed_node1 18714 1726853411.08304: done sending task result for task 02083763-bbaf-e784-4f7d-000000000134 18714 1726853411.08308: WORKER PROCESS EXITING 18714 1726853411.08388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853411.08783: done with get_vars() 18714 1726853411.08796: done getting variables 18714 1726853411.08853: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853411.09180: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:30:11 -0400 (0:00:00.769) 0:00:07.475 ****** 18714 1726853411.09218: entering _queue_task() for managed_node1/command 18714 1726853411.09956: worker is 1 (out of 1 available) 18714 1726853411.09968: exiting _queue_task() for managed_node1/command 18714 1726853411.09982: done queuing things up, now waiting for results queue to drain 18714 1726853411.09983: waiting for pending results... 18714 1726853411.10428: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 18714 1726853411.10520: in run() - task 02083763-bbaf-e784-4f7d-000000000135 18714 1726853411.10535: variable 'ansible_search_path' from source: unknown 18714 1726853411.10538: variable 'ansible_search_path' from source: unknown 18714 1726853411.11218: variable 'interface' from source: set_fact 18714 1726853411.11303: variable 'interface' from source: set_fact 18714 1726853411.11552: variable 'interface' from source: set_fact 18714 1726853411.11976: Loaded config def from plugin (lookup/items) 18714 1726853411.11981: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 18714 1726853411.11983: variable 'omit' from source: magic vars 18714 1726853411.12074: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.12191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.12209: variable 'omit' from 
source: magic vars 18714 1726853411.12543: variable 'ansible_distribution_major_version' from source: facts 18714 1726853411.12878: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853411.13376: variable 'type' from source: set_fact 18714 1726853411.13380: variable 'state' from source: include params 18714 1726853411.13382: variable 'interface' from source: set_fact 18714 1726853411.13384: variable 'current_interfaces' from source: set_fact 18714 1726853411.13386: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18714 1726853411.13389: variable 'omit' from source: magic vars 18714 1726853411.13391: variable 'omit' from source: magic vars 18714 1726853411.13394: variable 'item' from source: unknown 18714 1726853411.13591: variable 'item' from source: unknown 18714 1726853411.13613: variable 'omit' from source: magic vars 18714 1726853411.13654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853411.13693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853411.13975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853411.13980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853411.13983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853411.13985: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853411.13988: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.13990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.14084: Set connection var ansible_shell_executable to /bin/sh 18714 1726853411.14376: 
Set connection var ansible_timeout to 10 18714 1726853411.14379: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853411.14381: Set connection var ansible_connection to ssh 18714 1726853411.14384: Set connection var ansible_shell_type to sh 18714 1726853411.14386: Set connection var ansible_pipelining to False 18714 1726853411.14388: variable 'ansible_shell_executable' from source: unknown 18714 1726853411.14391: variable 'ansible_connection' from source: unknown 18714 1726853411.14394: variable 'ansible_module_compression' from source: unknown 18714 1726853411.14396: variable 'ansible_shell_type' from source: unknown 18714 1726853411.14398: variable 'ansible_shell_executable' from source: unknown 18714 1726853411.14400: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.14402: variable 'ansible_pipelining' from source: unknown 18714 1726853411.14404: variable 'ansible_timeout' from source: unknown 18714 1726853411.14406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.14521: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853411.14690: variable 'omit' from source: magic vars 18714 1726853411.14702: starting attempt loop 18714 1726853411.14709: running the handler 18714 1726853411.14728: _low_level_execute_command(): starting 18714 1726853411.14742: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853411.15652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.15675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853411.15693: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 18714 1726853411.15711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853411.15728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853411.15739: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853411.15755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.15776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853411.15788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853411.15799: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853411.15881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.15904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.15920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.15997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.17822: stdout chunk (state=3): >>>/root <<< 18714 1726853411.18215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.18219: stdout chunk (state=3): >>><<< 18714 1726853411.18221: stderr chunk (state=3): >>><<< 18714 1726853411.18224: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.18236: _low_level_execute_command(): starting 18714 1726853411.18239: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754 `" && echo ansible-tmp-1726853411.1812394-19061-168339218741754="` echo /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754 `" ) && sleep 0' 18714 1726853411.19501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.19592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.19643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.19668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.19701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.19804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.21764: stdout chunk (state=3): >>>ansible-tmp-1726853411.1812394-19061-168339218741754=/root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754 <<< 18714 1726853411.22120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.22124: stdout chunk (state=3): >>><<< 18714 1726853411.22126: stderr chunk (state=3): >>><<< 18714 1726853411.22159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853411.1812394-19061-168339218741754=/root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.22377: variable 'ansible_module_compression' from source: unknown 18714 1726853411.22380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853411.22663: variable 'ansible_facts' from source: unknown 18714 1726853411.22797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/AnsiballZ_command.py 18714 1726853411.23236: Sending initial data 18714 1726853411.23239: Sent initial data (156 bytes) 18714 1726853411.24069: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.24108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.24148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.24166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.24199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.24270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.25855: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853411.25932: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853411.26211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpjm7c066q /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/AnsiballZ_command.py <<< 18714 1726853411.26215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/AnsiballZ_command.py" <<< 18714 1726853411.26230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpjm7c066q" to remote "/root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/AnsiballZ_command.py" <<< 18714 1726853411.27561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.27647: stderr chunk (state=3): >>><<< 18714 1726853411.27694: stdout chunk (state=3): >>><<< 18714 1726853411.27721: done transferring module to remote 18714 1726853411.27806: _low_level_execute_command(): starting 18714 1726853411.27816: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/ /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/AnsiballZ_command.py && sleep 0' 18714 1726853411.28755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.28779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853411.28794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.28892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.28914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.28931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.29016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.30979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.30983: stdout chunk (state=3): >>><<< 18714 1726853411.30986: stderr chunk (state=3): >>><<< 18714 1726853411.30988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.30990: _low_level_execute_command(): starting 18714 1726853411.30993: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/AnsiballZ_command.py && sleep 0' 18714 1726853411.31966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.31970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.31977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.32042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.32045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.32051: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.32106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.48235: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 13:30:11.473018", "end": "2024-09-20 13:30:11.478213", "delta": "0:00:00.005195", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853411.50489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853411.50493: stdout chunk (state=3): >>><<< 18714 1726853411.50495: stderr chunk (state=3): >>><<< 18714 1726853411.50498: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 13:30:11.473018", "end": "2024-09-20 13:30:11.478213", "delta": "0:00:00.005195", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853411.50500: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853411.50507: _low_level_execute_command(): starting 18714 1726853411.50509: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853411.1812394-19061-168339218741754/ > /dev/null 2>&1 && sleep 0' 18714 1726853411.51179: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.51260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.51308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.51338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.51787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.55512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.55880: stderr chunk (state=3): >>><<< 18714 1726853411.55884: stdout chunk (state=3): >>><<< 18714 1726853411.55886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.55889: handler run complete 18714 1726853411.55891: Evaluated conditional (False): False 18714 1726853411.55893: attempt loop complete, returning result 18714 1726853411.55895: variable 'item' from source: unknown 18714 1726853411.55897: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.005195", "end": "2024-09-20 13:30:11.478213", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-20 13:30:11.473018" } 18714 1726853411.56432: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.56436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.56439: variable 'omit' from source: magic vars 18714 1726853411.56878: variable 'ansible_distribution_major_version' from source: facts 18714 1726853411.56883: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853411.57159: variable 
'type' from source: set_fact 18714 1726853411.57170: variable 'state' from source: include params 18714 1726853411.57181: variable 'interface' from source: set_fact 18714 1726853411.57188: variable 'current_interfaces' from source: set_fact 18714 1726853411.57198: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18714 1726853411.57207: variable 'omit' from source: magic vars 18714 1726853411.57227: variable 'omit' from source: magic vars 18714 1726853411.57676: variable 'item' from source: unknown 18714 1726853411.57679: variable 'item' from source: unknown 18714 1726853411.57681: variable 'omit' from source: magic vars 18714 1726853411.57683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853411.57686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853411.57688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853411.57689: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853411.57691: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.57693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.57764: Set connection var ansible_shell_executable to /bin/sh 18714 1726853411.58077: Set connection var ansible_timeout to 10 18714 1726853411.58081: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853411.58084: Set connection var ansible_connection to ssh 18714 1726853411.58087: Set connection var ansible_shell_type to sh 18714 1726853411.58089: Set connection var ansible_pipelining to False 18714 1726853411.58094: variable 'ansible_shell_executable' from source: unknown 18714 
1726853411.58097: variable 'ansible_connection' from source: unknown 18714 1726853411.58099: variable 'ansible_module_compression' from source: unknown 18714 1726853411.58102: variable 'ansible_shell_type' from source: unknown 18714 1726853411.58105: variable 'ansible_shell_executable' from source: unknown 18714 1726853411.58107: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.58110: variable 'ansible_pipelining' from source: unknown 18714 1726853411.58112: variable 'ansible_timeout' from source: unknown 18714 1726853411.58114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.58187: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853411.58201: variable 'omit' from source: magic vars 18714 1726853411.58464: starting attempt loop 18714 1726853411.58467: running the handler 18714 1726853411.58470: _low_level_execute_command(): starting 18714 1726853411.58474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853411.59566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.59585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.59747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.59862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.59931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.61687: stdout chunk (state=3): >>>/root <<< 18714 1726853411.61767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.61770: stdout chunk (state=3): >>><<< 18714 1726853411.61775: stderr chunk (state=3): >>><<< 18714 1726853411.61796: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.61813: _low_level_execute_command(): starting 18714 1726853411.61824: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512 `" && echo ansible-tmp-1726853411.6180263-19061-128002399288512="` echo /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512 `" ) && sleep 0' 18714 1726853411.63206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.63223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853411.63290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.63294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.63477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.63492: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.63639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.65542: stdout chunk (state=3): >>>ansible-tmp-1726853411.6180263-19061-128002399288512=/root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512 <<< 18714 1726853411.65844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.65848: stdout chunk (state=3): >>><<< 18714 1726853411.65852: stderr chunk (state=3): >>><<< 18714 1726853411.65854: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853411.6180263-19061-128002399288512=/root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.65856: variable 
'ansible_module_compression' from source: unknown 18714 1726853411.65858: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853411.65859: variable 'ansible_facts' from source: unknown 18714 1726853411.66020: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/AnsiballZ_command.py 18714 1726853411.66440: Sending initial data 18714 1726853411.66503: Sent initial data (156 bytes) 18714 1726853411.67584: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.67737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.67766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.69304: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 18714 1726853411.69328: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853411.69398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853411.69437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4jnoji4s /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/AnsiballZ_command.py <<< 18714 1726853411.69441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/AnsiballZ_command.py" <<< 18714 1726853411.69635: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4jnoji4s" to remote "/root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/AnsiballZ_command.py" <<< 18714 1726853411.70899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.71077: stderr chunk (state=3): >>><<< 18714 1726853411.71088: stdout chunk (state=3): >>><<< 18714 1726853411.71382: done transferring module to remote 18714 1726853411.71385: _low_level_execute_command(): starting 18714 1726853411.71388: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/ /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/AnsiballZ_command.py && sleep 0' 18714 1726853411.72477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.72492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.72665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.72803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.72807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.72809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.74900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.74903: stdout chunk (state=3): >>><<< 18714 1726853411.74907: stderr chunk (state=3): >>><<< 18714 1726853411.74993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.74996: _low_level_execute_command(): starting 18714 1726853411.74999: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/AnsiballZ_command.py && sleep 0' 18714 1726853411.75883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.75901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853411.75917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.75940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853411.75987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853411.76054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853411.76087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.76134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.76332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.91898: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 13:30:11.914644", "end": "2024-09-20 13:30:11.918123", "delta": "0:00:00.003479", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853411.93586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853411.93590: stdout chunk (state=3): >>><<< 18714 1726853411.93593: stderr chunk (state=3): >>><<< 18714 1726853411.93595: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 13:30:11.914644", "end": "2024-09-20 13:30:11.918123", "delta": "0:00:00.003479", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
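The loop items visible in this log (`ip link add lsr27 type veth peer name peerlsr27`, then `ip link set peerlsr27 up`) and the evaluated conditional `type == 'veth' and state == 'present' and interface not in current_interfaces` suggest the task driving this trace is a `command` loop that creates a veth pair when the interface is absent. A minimal sketch of such a task follows; the task name, the `when` wiring, and the variable names other than those quoted from the log (`interface`, `current_interfaces`, `type`, `state`) are assumptions, not taken from the playbook itself:

```yaml
# Hypothetical reconstruction of the task seen in the log above.
# Only the two shell commands and the loop variable 'item' are
# confirmed by the log output; everything else is assumed.
- name: Create veth interface pair and bring the peer up
  ansible.builtin.command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
  when:
    - type == 'veth'
    - state == 'present'
    - interface not in current_interfaces
```

With `interface: lsr27`, this yields exactly the two commands the log records as `ok: [managed_node1]`, each reported with `"changed": false` in the loop result despite `"changed": true` in the raw module output, because the action plugin normalizes the per-item result.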
18714 1726853411.93603: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853411.93606: _low_level_execute_command(): starting 18714 1726853411.93608: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853411.6180263-19061-128002399288512/ > /dev/null 2>&1 && sleep 0' 18714 1726853411.94241: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.94263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853411.94376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.94402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.94474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853411.96332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853411.96384: stdout chunk (state=3): >>><<< 18714 1726853411.96387: stderr chunk (state=3): >>><<< 18714 1726853411.96404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853411.96414: handler run complete 18714 1726853411.96584: 
Evaluated conditional (False): False 18714 1726853411.96587: attempt loop complete, returning result 18714 1726853411.96589: variable 'item' from source: unknown 18714 1726853411.96591: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003479", "end": "2024-09-20 13:30:11.918123", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-20 13:30:11.914644" } 18714 1726853411.96842: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.96846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.96848: variable 'omit' from source: magic vars 18714 1726853411.97041: variable 'ansible_distribution_major_version' from source: facts 18714 1726853411.97103: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853411.97298: variable 'type' from source: set_fact 18714 1726853411.97307: variable 'state' from source: include params 18714 1726853411.97317: variable 'interface' from source: set_fact 18714 1726853411.97325: variable 'current_interfaces' from source: set_fact 18714 1726853411.97336: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18714 1726853411.97362: variable 'omit' from source: magic vars 18714 1726853411.97376: variable 'omit' from source: magic vars 18714 1726853411.97422: variable 'item' from source: unknown 18714 1726853411.97540: variable 'item' from source: unknown 18714 1726853411.97543: variable 'omit' from source: magic vars 18714 1726853411.97599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853411.97606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 18714 1726853411.97608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853411.97621: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853411.97629: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.97636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.97775: Set connection var ansible_shell_executable to /bin/sh 18714 1726853411.97779: Set connection var ansible_timeout to 10 18714 1726853411.97781: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853411.97783: Set connection var ansible_connection to ssh 18714 1726853411.97785: Set connection var ansible_shell_type to sh 18714 1726853411.97787: Set connection var ansible_pipelining to False 18714 1726853411.97806: variable 'ansible_shell_executable' from source: unknown 18714 1726853411.97823: variable 'ansible_connection' from source: unknown 18714 1726853411.97837: variable 'ansible_module_compression' from source: unknown 18714 1726853411.97876: variable 'ansible_shell_type' from source: unknown 18714 1726853411.97879: variable 'ansible_shell_executable' from source: unknown 18714 1726853411.97881: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853411.97883: variable 'ansible_pipelining' from source: unknown 18714 1726853411.97885: variable 'ansible_timeout' from source: unknown 18714 1726853411.97887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853411.97993: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 
1726853411.98013: variable 'omit' from source: magic vars 18714 1726853411.98040: starting attempt loop 18714 1726853411.98043: running the handler 18714 1726853411.98046: _low_level_execute_command(): starting 18714 1726853411.98124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853411.98759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853411.98788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853411.98900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853411.98931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853411.99001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.00602: stdout chunk (state=3): >>>/root <<< 18714 1726853412.00783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.00786: stdout chunk (state=3): >>><<< 18714 1726853412.00788: stderr chunk (state=3): >>><<< 18714 
1726853412.00793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.00825: _low_level_execute_command(): starting 18714 1726853412.00829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132 `" && echo ansible-tmp-1726853412.0079777-19061-281245082912132="` echo /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132 `" ) && sleep 0' 18714 1726853412.01263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.01266: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853412.01268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18714 1726853412.01273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.01275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853412.01277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.01321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.01325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.01375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.03307: stdout chunk (state=3): >>>ansible-tmp-1726853412.0079777-19061-281245082912132=/root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132 <<< 18714 1726853412.03408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.03432: stderr chunk (state=3): >>><<< 18714 1726853412.03444: stdout chunk (state=3): >>><<< 18714 1726853412.03460: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853412.0079777-19061-281245082912132=/root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.03475: variable 'ansible_module_compression' from source: unknown 18714 1726853412.03508: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853412.03522: variable 'ansible_facts' from source: unknown 18714 1726853412.03569: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/AnsiballZ_command.py 18714 1726853412.03656: Sending initial data 18714 1726853412.03660: Sent initial data (156 bytes) 18714 1726853412.04093: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.04096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.04106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 18714 1726853412.04108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853412.04110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.04150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.04154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.04201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.05806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853412.05857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853412.05898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpiy4di4bo /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/AnsiballZ_command.py <<< 18714 1726853412.05908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/AnsiballZ_command.py" <<< 18714 1726853412.05939: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpiy4di4bo" to remote "/root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/AnsiballZ_command.py" <<< 18714 1726853412.06683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.06716: stderr chunk (state=3): >>><<< 18714 1726853412.06722: stdout chunk (state=3): >>><<< 18714 1726853412.06749: done transferring module to remote 18714 1726853412.06756: _low_level_execute_command(): starting 18714 1726853412.06760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/ /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/AnsiballZ_command.py && sleep 0' 18714 1726853412.07176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.07179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 
18714 1726853412.07181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.07184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.07185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853412.07187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.07231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.07235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.07281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.09187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.09198: stdout chunk (state=3): >>><<< 18714 1726853412.09201: stderr chunk (state=3): >>><<< 18714 1726853412.09203: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.09205: _low_level_execute_command(): starting 18714 1726853412.09207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/AnsiballZ_command.py && sleep 0' 18714 1726853412.09684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853412.09697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853412.09706: stderr chunk (state=3): >>>debug2: match found <<< 18714 1726853412.09716: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.09801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853412.09815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.09911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.25593: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 13:30:12.251199", "end": "2024-09-20 13:30:12.254827", "delta": "0:00:00.003628", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853412.27067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853412.27095: stderr chunk (state=3): >>><<< 18714 1726853412.27098: stdout chunk (state=3): >>><<< 18714 1726853412.27113: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 13:30:12.251199", "end": "2024-09-20 13:30:12.254827", "delta": "0:00:00.003628", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853412.27139: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853412.27142: _low_level_execute_command(): starting 18714 1726853412.27148: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853412.0079777-19061-281245082912132/ > /dev/null 2>&1 && sleep 0' 18714 1726853412.27556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.27592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853412.27595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.27597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853412.27599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.27648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.27656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853412.27658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.27700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.29498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.29528: stderr chunk (state=3): >>><<< 18714 1726853412.29531: stdout chunk (state=3): >>><<< 18714 1726853412.29544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.29548: handler run complete 18714 1726853412.29564: Evaluated conditional (False): False 18714 1726853412.29573: attempt loop complete, returning result 18714 1726853412.29589: variable 'item' from source: unknown 18714 1726853412.29654: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003628", "end": "2024-09-20 13:30:12.254827", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-20 13:30:12.251199" } 18714 1726853412.29765: dumping result to json 18714 1726853412.29768: done dumping result, returning 18714 1726853412.29770: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 [02083763-bbaf-e784-4f7d-000000000135] 18714 1726853412.29774: sending task result for task 02083763-bbaf-e784-4f7d-000000000135 18714 1726853412.29818: done sending task result for task 02083763-bbaf-e784-4f7d-000000000135 18714 1726853412.29821: WORKER PROCESS EXITING 18714 1726853412.29881: no more pending results, returning what we have 18714 1726853412.29884: results queue empty 18714 1726853412.29885: checking for any_errors_fatal 18714 1726853412.29897: done checking for any_errors_fatal 18714 1726853412.29897: checking for max_fail_percentage 18714 1726853412.29899: done checking for max_fail_percentage 18714 1726853412.29899: checking to see if all hosts have failed and the running result is not ok 18714 1726853412.29900: done checking to see if all hosts have failed 18714 1726853412.29901: getting the remaining hosts for this loop 18714 1726853412.29902: done getting the remaining hosts for this loop 18714 1726853412.29905: getting the next task for host managed_node1 18714 1726853412.29912: done getting next task for host managed_node1 
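The `ok:` result above reports `cmd` as an argv list (`["ip", "link", "set", "lsr27", "up"]`) even though the task supplied a single free-form string, and the module invocation shows `_uses_shell: false`. With no shell involved, the free-form string is tokenized with POSIX quoting rules and exec'd directly, so no globbing or variable expansion occurs. A minimal sketch of that split (Ansible's actual parsing lives inside the command module; this only illustrates the POSIX tokenization):

```python
import shlex

def to_argv(raw_params: str) -> list[str]:
    # POSIX-style tokenization: quotes group words into one argument;
    # nothing is glob- or variable-expanded, because with
    # _uses_shell=false no shell ever sees the string.
    return shlex.split(raw_params)

print(to_argv("ip link set lsr27 up"))
# ['ip', 'link', 'set', 'lsr27', 'up']
```

This is why the `delta`/`rc` in the result refer to the `ip` binary itself: the argv list is handed straight to exec, matching the `cmd` field echoed back in the JSON above.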
18714 1726853412.29915: ^ task is: TASK: Set up veth as managed by NetworkManager 18714 1726853412.29917: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853412.29920: getting variables 18714 1726853412.29922: in VariableManager get_vars() 18714 1726853412.29953: Calling all_inventory to load vars for managed_node1 18714 1726853412.29956: Calling groups_inventory to load vars for managed_node1 18714 1726853412.29959: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853412.29969: Calling all_plugins_play to load vars for managed_node1 18714 1726853412.29979: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853412.29983: Calling groups_plugins_play to load vars for managed_node1 18714 1726853412.30145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853412.30268: done with get_vars() 18714 1726853412.30277: done getting variables 18714 1726853412.30321: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:30:12 -0400 (0:00:01.211) 0:00:08.686 ****** 18714 1726853412.30341: entering _queue_task() for managed_node1/command 18714 1726853412.30545: worker is 1 (out of 1 available) 18714 1726853412.30559: exiting _queue_task() for managed_node1/command 18714 1726853412.30574: done queuing things up, now waiting for results queue to drain 18714 1726853412.30575: waiting for pending results... 18714 1726853412.30723: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 18714 1726853412.30785: in run() - task 02083763-bbaf-e784-4f7d-000000000136 18714 1726853412.30801: variable 'ansible_search_path' from source: unknown 18714 1726853412.30805: variable 'ansible_search_path' from source: unknown 18714 1726853412.30830: calling self._execute() 18714 1726853412.30889: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853412.30893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853412.30903: variable 'omit' from source: magic vars 18714 1726853412.31172: variable 'ansible_distribution_major_version' from source: facts 18714 1726853412.31181: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853412.31281: variable 'type' from source: set_fact 18714 1726853412.31284: variable 'state' from source: include params 18714 1726853412.31290: Evaluated conditional (type == 'veth' and state == 'present'): True 18714 1726853412.31295: variable 'omit' from source: magic vars 18714 1726853412.31322: variable 'omit' from source: magic vars 18714 1726853412.31393: variable 'interface' from source: set_fact 18714 1726853412.31406: variable 'omit' from source: magic vars 18714 1726853412.31438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853412.31469: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853412.31486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853412.31499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853412.31508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853412.31532: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853412.31535: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853412.31537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853412.31609: Set connection var ansible_shell_executable to /bin/sh 18714 1726853412.31614: Set connection var ansible_timeout to 10 18714 1726853412.31619: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853412.31625: Set connection var ansible_connection to ssh 18714 1726853412.31630: Set connection var ansible_shell_type to sh 18714 1726853412.31635: Set connection var ansible_pipelining to False 18714 1726853412.31653: variable 'ansible_shell_executable' from source: unknown 18714 1726853412.31656: variable 'ansible_connection' from source: unknown 18714 1726853412.31660: variable 'ansible_module_compression' from source: unknown 18714 1726853412.31662: variable 'ansible_shell_type' from source: unknown 18714 1726853412.31666: variable 'ansible_shell_executable' from source: unknown 18714 1726853412.31668: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853412.31673: variable 'ansible_pipelining' from source: unknown 18714 1726853412.31675: variable 'ansible_timeout' from source: unknown 18714 1726853412.31677: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 18714 1726853412.31770: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853412.31779: variable 'omit' from source: magic vars 18714 1726853412.31785: starting attempt loop 18714 1726853412.31798: running the handler 18714 1726853412.31801: _low_level_execute_command(): starting 18714 1726853412.31808: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853412.32309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.32312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.32315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.32317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.32370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.32375: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.32424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.33998: stdout chunk (state=3): >>>/root <<< 18714 1726853412.34098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.34125: stderr chunk (state=3): >>><<< 18714 1726853412.34128: stdout chunk (state=3): >>><<< 18714 1726853412.34147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.34163: _low_level_execute_command(): starting 18714 1726853412.34173: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034 `" && echo 
ansible-tmp-1726853412.341473-19128-154038137380034="` echo /root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034 `" ) && sleep 0' 18714 1726853412.34603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.34608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853412.34619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.34621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.34624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.34665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.34668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.34711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.36578: stdout chunk (state=3): >>>ansible-tmp-1726853412.341473-19128-154038137380034=/root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034 <<< 18714 1726853412.36684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 
1726853412.36715: stderr chunk (state=3): >>><<< 18714 1726853412.36718: stdout chunk (state=3): >>><<< 18714 1726853412.36734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853412.341473-19128-154038137380034=/root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.36760: variable 'ansible_module_compression' from source: unknown 18714 1726853412.36807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853412.36837: variable 'ansible_facts' from source: unknown 18714 1726853412.36898: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/AnsiballZ_command.py 18714 1726853412.36995: Sending initial data 18714 
1726853412.36998: Sent initial data (155 bytes) 18714 1726853412.37455: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.37458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853412.37460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.37463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853412.37465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.37515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.37518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853412.37522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.37561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.39080: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18714 1726853412.39086: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853412.39112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853412.39149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpxlu3sopl /root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/AnsiballZ_command.py <<< 18714 1726853412.39158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/AnsiballZ_command.py" <<< 18714 1726853412.39198: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpxlu3sopl" to remote "/root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/AnsiballZ_command.py" <<< 18714 1726853412.39201: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/AnsiballZ_command.py" <<< 18714 1726853412.39716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.39758: stderr chunk (state=3): >>><<< 18714 1726853412.39761: stdout chunk (state=3): >>><<< 18714 1726853412.39805: done transferring module to remote 18714 1726853412.39813: _low_level_execute_command(): starting 18714 1726853412.39818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/ /root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/AnsiballZ_command.py && sleep 0' 18714 1726853412.40268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.40275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853412.40277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.40286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.40288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853412.40290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.40334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.40338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853412.40342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.40384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.42104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.42133: stderr chunk (state=3): 
>>><<< 18714 1726853412.42136: stdout chunk (state=3): >>><<< 18714 1726853412.42154: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.42157: _low_level_execute_command(): starting 18714 1726853412.42160: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/AnsiballZ_command.py && sleep 0' 18714 1726853412.42612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.42616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
not found <<< 18714 1726853412.42618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18714 1726853412.42620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.42622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.42727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.42753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.59767: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 13:30:12.577798", "end": "2024-09-20 13:30:12.596426", "delta": "0:00:00.018628", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853412.61438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853412.61457: stderr chunk (state=3): >>><<< 18714 1726853412.61469: stdout chunk (state=3): >>><<< 18714 1726853412.61506: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 13:30:12.577798", "end": "2024-09-20 13:30:12.596426", "delta": "0:00:00.018628", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
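The exchange above shows the command module's wire format: `_low_level_execute_command()` runs the transferred `AnsiballZ_command.py` over SSH, and the module prints a single JSON object on stdout (`changed`, `rc`, `cmd`, `start`/`end`/`delta`, plus an `invocation` echo of the module args). A minimal sketch of pulling the interesting fields back out of such a payload — the JSON literal is abridged from the log above, and the `summarize` helper is our own illustration, not part of Ansible:

```python
import json

# Result JSON as emitted by AnsiballZ_command.py in the log above
# (abridged to the fields shown there).
raw = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
"cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"],
"start": "2024-09-20 13:30:12.577798", "end": "2024-09-20 13:30:12.596426",
"delta": "0:00:00.018628", "msg": ""}'''

def summarize(payload: str) -> str:
    """Parse a command-module result blob into a one-line summary."""
    res = json.loads(payload)
    status = "ok" if res["rc"] == 0 else "failed"
    return f'{status}: {" ".join(res["cmd"])} (took {res["delta"]})'

print(summarize(raw))
```

The controller does essentially this after the `stdout chunk` lines above: it parses the JSON out of stdout, then formats the `ok: [managed_node1] => {...}` block seen later in the log.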
18714 1726853412.61577: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853412.61581: _low_level_execute_command(): starting 18714 1726853412.61583: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853412.341473-19128-154038137380034/ > /dev/null 2>&1 && sleep 0' 18714 1726853412.62232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853412.62254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853412.62269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853412.62289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853412.62338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853412.62403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853412.62421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853412.62452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.62558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853412.64779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853412.64783: stdout chunk (state=3): >>><<< 18714 1726853412.64786: stderr chunk (state=3): >>><<< 18714 1726853412.64788: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853412.64790: handler run complete 18714 1726853412.64792: Evaluated conditional (False): False 18714 1726853412.64794: attempt loop complete, returning result 18714 1726853412.64796: _execute() done 18714 1726853412.64798: dumping result to json 18714 1726853412.64800: done dumping result, returning 18714 1726853412.64802: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-e784-4f7d-000000000136] 18714 1726853412.64804: sending task result for task 02083763-bbaf-e784-4f7d-000000000136 18714 1726853412.64877: done sending task result for task 02083763-bbaf-e784-4f7d-000000000136 18714 1726853412.64881: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.018628", "end": "2024-09-20 13:30:12.596426", "rc": 0, "start": "2024-09-20 13:30:12.577798" } 18714 1726853412.64948: no more pending results, returning what we have 18714 1726853412.64954: results queue empty 18714 1726853412.64955: checking for any_errors_fatal 18714 1726853412.64969: done checking for any_errors_fatal 18714 1726853412.64970: checking for max_fail_percentage 18714 1726853412.64974: done checking for max_fail_percentage 18714 1726853412.64975: checking to see if all hosts have failed and the running result is not ok 18714 1726853412.64976: done checking to see if all hosts have failed 18714 1726853412.64976: getting the remaining hosts for this loop 18714 1726853412.64978: done getting the remaining hosts for this loop 18714 1726853412.64981: getting the next task for host managed_node1 18714 1726853412.64988: done getting next task for host managed_node1 18714 1726853412.64991: ^ task is: TASK: Delete veth interface {{ interface }} 18714 1726853412.64994: ^ state is: HOST STATE: block=2, 
task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853412.64997: getting variables 18714 1726853412.64999: in VariableManager get_vars() 18714 1726853412.65030: Calling all_inventory to load vars for managed_node1 18714 1726853412.65033: Calling groups_inventory to load vars for managed_node1 18714 1726853412.65036: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853412.65048: Calling all_plugins_play to load vars for managed_node1 18714 1726853412.65053: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853412.65056: Calling groups_plugins_play to load vars for managed_node1 18714 1726853412.65638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853412.66298: done with get_vars() 18714 1726853412.66310: done getting variables 18714 1726853412.66372: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853412.66692: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 
13:30:12 -0400 (0:00:00.363) 0:00:09.050 ******
18714 1726853412.66723: entering _queue_task() for managed_node1/command
18714 1726853412.67226: worker is 1 (out of 1 available)
18714 1726853412.67241: exiting _queue_task() for managed_node1/command
18714 1726853412.67257: done queuing things up, now waiting for results queue to drain
18714 1726853412.67258: waiting for pending results...
18714 1726853412.67710: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27
18714 1726853412.67914: in run() - task 02083763-bbaf-e784-4f7d-000000000137
18714 1726853412.67918: variable 'ansible_search_path' from source: unknown
18714 1726853412.67921: variable 'ansible_search_path' from source: unknown
18714 1726853412.68022: calling self._execute()
18714 1726853412.68181: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853412.68194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853412.68210: variable 'omit' from source: magic vars
18714 1726853412.68596: variable 'ansible_distribution_major_version' from source: facts
18714 1726853412.68615: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853412.68828: variable 'type' from source: set_fact
18714 1726853412.68838: variable 'state' from source: include params
18714 1726853412.68846: variable 'interface' from source: set_fact
18714 1726853412.68854: variable 'current_interfaces' from source: set_fact
18714 1726853412.68866: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False
18714 1726853412.68875: when evaluation is False, skipping this task
18714 1726853412.68884: _execute() done
18714 1726853412.68897: dumping result to json
18714 1726853412.68904: done dumping result, returning
18714 1726853412.68913: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 [02083763-bbaf-e784-4f7d-000000000137]
18714 1726853412.68975: sending task result for task 02083763-bbaf-e784-4f7d-000000000137
18714 1726853412.69047: done sending task result for task 02083763-bbaf-e784-4f7d-000000000137
18714 1726853412.69051: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18714 1726853412.69101: no more pending results, returning what we have
18714 1726853412.69105: results queue empty
18714 1726853412.69106: checking for any_errors_fatal
18714 1726853412.69113: done checking for any_errors_fatal
18714 1726853412.69114: checking for max_fail_percentage
18714 1726853412.69116: done checking for max_fail_percentage
18714 1726853412.69116: checking to see if all hosts have failed and the running result is not ok
18714 1726853412.69117: done checking to see if all hosts have failed
18714 1726853412.69118: getting the remaining hosts for this loop
18714 1726853412.69119: done getting the remaining hosts for this loop
18714 1726853412.69123: getting the next task for host managed_node1
18714 1726853412.69130: done getting next task for host managed_node1
18714 1726853412.69132: ^ task is: TASK: Create dummy interface {{ interface }}
18714 1726853412.69135: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853412.69139: getting variables
18714 1726853412.69141: in VariableManager get_vars()
18714 1726853412.69175: Calling all_inventory to load vars for managed_node1
18714 1726853412.69178: Calling groups_inventory to load vars for managed_node1
18714 1726853412.69181: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.69194: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.69196: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.69199: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.69483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.70183: done with get_vars()
18714 1726853412.70193: done getting variables
18714 1726853412.70255: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18714 1726853412.70585: variable 'interface' from source: set_fact

TASK [Create dummy interface lsr27] ********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49
Friday 20 September 2024 13:30:12 -0400 (0:00:00.038) 0:00:09.089 ******
18714 1726853412.70617: entering _queue_task() for managed_node1/command
18714 1726853412.71159: worker is 1 (out of 1 available)
18714 1726853412.71377: exiting _queue_task() for managed_node1/command
18714 1726853412.71390: done queuing things up, now waiting for results queue to drain
18714 1726853412.71391: waiting for pending results...
18714 1726853412.71792: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27
18714 1726853412.71797: in run() - task 02083763-bbaf-e784-4f7d-000000000138
18714 1726853412.71800: variable 'ansible_search_path' from source: unknown
18714 1726853412.71803: variable 'ansible_search_path' from source: unknown
18714 1726853412.71925: calling self._execute()
18714 1726853412.72176: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853412.72180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853412.72183: variable 'omit' from source: magic vars
18714 1726853412.72923: variable 'ansible_distribution_major_version' from source: facts
18714 1726853412.72994: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853412.73385: variable 'type' from source: set_fact
18714 1726853412.73476: variable 'state' from source: include params
18714 1726853412.73495: variable 'interface' from source: set_fact
18714 1726853412.73526: variable 'current_interfaces' from source: set_fact
18714 1726853412.73538: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False
18714 1726853412.73544: when evaluation is False, skipping this task
18714 1726853412.73580: _execute() done
18714 1726853412.73586: dumping result to json
18714 1726853412.73592: done dumping result, returning
18714 1726853412.73601: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 [02083763-bbaf-e784-4f7d-000000000138]
18714 1726853412.73607: sending task result for task 02083763-bbaf-e784-4f7d-000000000138
18714 1726853412.73907: done sending task result for task 02083763-bbaf-e784-4f7d-000000000138
18714 1726853412.73910: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18714 1726853412.74017: no more pending results, returning what we have
18714 1726853412.74020: results queue empty
18714 1726853412.74021: checking for any_errors_fatal
18714 1726853412.74028: done checking for any_errors_fatal
18714 1726853412.74029: checking for max_fail_percentage
18714 1726853412.74030: done checking for max_fail_percentage
18714 1726853412.74031: checking to see if all hosts have failed and the running result is not ok
18714 1726853412.74032: done checking to see if all hosts have failed
18714 1726853412.74032: getting the remaining hosts for this loop
18714 1726853412.74034: done getting the remaining hosts for this loop
18714 1726853412.74037: getting the next task for host managed_node1
18714 1726853412.74043: done getting next task for host managed_node1
18714 1726853412.74045: ^ task is: TASK: Delete dummy interface {{ interface }}
18714 1726853412.74048: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853412.74053: getting variables
18714 1726853412.74055: in VariableManager get_vars()
18714 1726853412.74085: Calling all_inventory to load vars for managed_node1
18714 1726853412.74088: Calling groups_inventory to load vars for managed_node1
18714 1726853412.74091: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.74101: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.74104: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.74106: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.74282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.74668: done with get_vars()
18714 1726853412.74882: done getting variables
18714 1726853412.74941: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18714 1726853412.75053: variable 'interface' from source: set_fact

TASK [Delete dummy interface lsr27] ********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54
Friday 20 September 2024 13:30:12 -0400 (0:00:00.044) 0:00:09.134 ******
18714 1726853412.75084: entering _queue_task() for managed_node1/command
18714 1726853412.75748: worker is 1 (out of 1 available)
18714 1726853412.75765: exiting _queue_task() for managed_node1/command
18714 1726853412.75883: done queuing things up, now waiting for results queue to drain
18714 1726853412.75885: waiting for pending results...
18714 1726853412.76386: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27
18714 1726853412.76411: in run() - task 02083763-bbaf-e784-4f7d-000000000139
18714 1726853412.76432: variable 'ansible_search_path' from source: unknown
18714 1726853412.76576: variable 'ansible_search_path' from source: unknown
18714 1726853412.76581: calling self._execute()
18714 1726853412.76637: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853412.76648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853412.76663: variable 'omit' from source: magic vars
18714 1726853412.77022: variable 'ansible_distribution_major_version' from source: facts
18714 1726853412.77041: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853412.77243: variable 'type' from source: set_fact
18714 1726853412.77253: variable 'state' from source: include params
18714 1726853412.77377: variable 'interface' from source: set_fact
18714 1726853412.77380: variable 'current_interfaces' from source: set_fact
18714 1726853412.77383: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False
18714 1726853412.77385: when evaluation is False, skipping this task
18714 1726853412.77387: _execute() done
18714 1726853412.77389: dumping result to json
18714 1726853412.77391: done dumping result, returning
18714 1726853412.77393: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27 [02083763-bbaf-e784-4f7d-000000000139]
18714 1726853412.77395: sending task result for task 02083763-bbaf-e784-4f7d-000000000139
18714 1726853412.77453: done sending task result for task 02083763-bbaf-e784-4f7d-000000000139
18714 1726853412.77455: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18714 1726853412.77607: no more pending results, returning what we have
18714 1726853412.77609: results queue empty
18714 1726853412.77610: checking for any_errors_fatal
18714 1726853412.77615: done checking for any_errors_fatal
18714 1726853412.77616: checking for max_fail_percentage
18714 1726853412.77617: done checking for max_fail_percentage
18714 1726853412.77618: checking to see if all hosts have failed and the running result is not ok
18714 1726853412.77618: done checking to see if all hosts have failed
18714 1726853412.77619: getting the remaining hosts for this loop
18714 1726853412.77620: done getting the remaining hosts for this loop
18714 1726853412.77623: getting the next task for host managed_node1
18714 1726853412.77627: done getting next task for host managed_node1
18714 1726853412.77629: ^ task is: TASK: Create tap interface {{ interface }}
18714 1726853412.77632: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853412.77635: getting variables
18714 1726853412.77636: in VariableManager get_vars()
18714 1726853412.77662: Calling all_inventory to load vars for managed_node1
18714 1726853412.77665: Calling groups_inventory to load vars for managed_node1
18714 1726853412.77668: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.77680: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.77683: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.77686: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.78020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.78502: done with get_vars()
18714 1726853412.78511: done getting variables
18714 1726853412.78567: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18714 1726853412.78874: variable 'interface' from source: set_fact

TASK [Create tap interface lsr27] **********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60
Friday 20 September 2024 13:30:12 -0400 (0:00:00.038) 0:00:09.172 ******
18714 1726853412.78903: entering _queue_task() for managed_node1/command
18714 1726853412.79385: worker is 1 (out of 1 available)
18714 1726853412.79399: exiting _queue_task() for managed_node1/command
18714 1726853412.79410: done queuing things up, now waiting for results queue to drain
18714 1726853412.79411: waiting for pending results...
18714 1726853412.79989: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27
18714 1726853412.80178: in run() - task 02083763-bbaf-e784-4f7d-00000000013a
18714 1726853412.80183: variable 'ansible_search_path' from source: unknown
18714 1726853412.80186: variable 'ansible_search_path' from source: unknown
18714 1726853412.80191: calling self._execute()
18714 1726853412.80415: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853412.80419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853412.80421: variable 'omit' from source: magic vars
18714 1726853412.81178: variable 'ansible_distribution_major_version' from source: facts
18714 1726853412.81182: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853412.81584: variable 'type' from source: set_fact
18714 1726853412.81594: variable 'state' from source: include params
18714 1726853412.81602: variable 'interface' from source: set_fact
18714 1726853412.81615: variable 'current_interfaces' from source: set_fact
18714 1726853412.81627: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False
18714 1726853412.81720: when evaluation is False, skipping this task
18714 1726853412.81723: _execute() done
18714 1726853412.81726: dumping result to json
18714 1726853412.81728: done dumping result, returning
18714 1726853412.81730: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27 [02083763-bbaf-e784-4f7d-00000000013a]
18714 1726853412.81734: sending task result for task 02083763-bbaf-e784-4f7d-00000000013a
18714 1726853412.82233: done sending task result for task 02083763-bbaf-e784-4f7d-00000000013a
18714 1726853412.82236: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18714 1726853412.82322: no more pending results, returning what we have
18714 1726853412.82325: results queue empty
18714 1726853412.82326: checking for any_errors_fatal
18714 1726853412.82331: done checking for any_errors_fatal
18714 1726853412.82332: checking for max_fail_percentage
18714 1726853412.82334: done checking for max_fail_percentage
18714 1726853412.82334: checking to see if all hosts have failed and the running result is not ok
18714 1726853412.82335: done checking to see if all hosts have failed
18714 1726853412.82336: getting the remaining hosts for this loop
18714 1726853412.82337: done getting the remaining hosts for this loop
18714 1726853412.82341: getting the next task for host managed_node1
18714 1726853412.82346: done getting next task for host managed_node1
18714 1726853412.82349: ^ task is: TASK: Delete tap interface {{ interface }}
18714 1726853412.82354: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853412.82359: getting variables
18714 1726853412.82360: in VariableManager get_vars()
18714 1726853412.82389: Calling all_inventory to load vars for managed_node1
18714 1726853412.82392: Calling groups_inventory to load vars for managed_node1
18714 1726853412.82396: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.82406: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.82409: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.82412: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.82797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.83202: done with get_vars()
18714 1726853412.83212: done getting variables
18714 1726853412.83473: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18714 1726853412.83587: variable 'interface' from source: set_fact

TASK [Delete tap interface lsr27] **********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Friday 20 September 2024 13:30:12 -0400 (0:00:00.047) 0:00:09.219 ******
18714 1726853412.83617: entering _queue_task() for managed_node1/command
18714 1726853412.84287: worker is 1 (out of 1 available)
18714 1726853412.84302: exiting _queue_task() for managed_node1/command
18714 1726853412.84315: done queuing things up, now waiting for results queue to drain
18714 1726853412.84315: waiting for pending results...
18714 1726853412.84716: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27
18714 1726853412.84887: in run() - task 02083763-bbaf-e784-4f7d-00000000013b
18714 1726853412.84993: variable 'ansible_search_path' from source: unknown
18714 1726853412.85000: variable 'ansible_search_path' from source: unknown
18714 1726853412.85044: calling self._execute()
18714 1726853412.85239: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853412.85250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853412.85264: variable 'omit' from source: magic vars
18714 1726853412.85927: variable 'ansible_distribution_major_version' from source: facts
18714 1726853412.86104: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853412.86308: variable 'type' from source: set_fact
18714 1726853412.86437: variable 'state' from source: include params
18714 1726853412.86447: variable 'interface' from source: set_fact
18714 1726853412.86456: variable 'current_interfaces' from source: set_fact
18714 1726853412.86468: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
18714 1726853412.86478: when evaluation is False, skipping this task
18714 1726853412.86487: _execute() done
18714 1726853412.86495: dumping result to json
18714 1726853412.86503: done dumping result, returning
18714 1726853412.86513: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27 [02083763-bbaf-e784-4f7d-00000000013b]
18714 1726853412.86521: sending task result for task 02083763-bbaf-e784-4f7d-00000000013b
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18714 1726853412.86677: no more pending results, returning what we have
18714 1726853412.86681: results queue empty
18714 1726853412.86682: checking for any_errors_fatal
18714 1726853412.86690: done checking for any_errors_fatal
18714 1726853412.86691: checking for max_fail_percentage
18714 1726853412.86692: done checking for max_fail_percentage
18714 1726853412.86693: checking to see if all hosts have failed and the running result is not ok
18714 1726853412.86694: done checking to see if all hosts have failed
18714 1726853412.86694: getting the remaining hosts for this loop
18714 1726853412.86696: done getting the remaining hosts for this loop
18714 1726853412.86699: getting the next task for host managed_node1
18714 1726853412.86708: done getting next task for host managed_node1
18714 1726853412.86710: ^ task is: TASK: Include the task 'assert_device_present.yml'
18714 1726853412.86712: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853412.86715: getting variables
18714 1726853412.86717: in VariableManager get_vars()
18714 1726853412.86744: Calling all_inventory to load vars for managed_node1
18714 1726853412.86747: Calling groups_inventory to load vars for managed_node1
18714 1726853412.86752: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.86764: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.86766: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.86769: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.87223: done sending task result for task 02083763-bbaf-e784-4f7d-00000000013b
18714 1726853412.87228: WORKER PROCESS EXITING
18714 1726853412.87486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.87786: done with get_vars()
18714 1726853412.87795: done getting variables

TASK [Include the task 'assert_device_present.yml'] ****************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30
Friday 20 September 2024 13:30:12 -0400 (0:00:00.042) 0:00:09.262 ******
18714 1726853412.87885: entering _queue_task() for managed_node1/include_tasks
18714 1726853412.88343: worker is 1 (out of 1 available)
18714 1726853412.88359: exiting _queue_task() for managed_node1/include_tasks
18714 1726853412.88491: done queuing things up, now waiting for results queue to drain
18714 1726853412.88492: waiting for pending results...
18714 1726853412.88629: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml'
18714 1726853412.88755: in run() - task 02083763-bbaf-e784-4f7d-000000000012
18714 1726853412.88790: variable 'ansible_search_path' from source: unknown
18714 1726853412.88834: calling self._execute()
18714 1726853412.88912: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853412.88924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853412.88955: variable 'omit' from source: magic vars
18714 1726853412.89305: variable 'ansible_distribution_major_version' from source: facts
18714 1726853412.89319: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853412.89330: _execute() done
18714 1726853412.89337: dumping result to json
18714 1726853412.89344: done dumping result, returning
18714 1726853412.89357: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-e784-4f7d-000000000012]
18714 1726853412.89370: sending task result for task 02083763-bbaf-e784-4f7d-000000000012
18714 1726853412.89488: no more pending results, returning what we have
18714 1726853412.89492: in VariableManager get_vars()
18714 1726853412.89524: Calling all_inventory to load vars for managed_node1
18714 1726853412.89527: Calling groups_inventory to load vars for managed_node1
18714 1726853412.89531: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.89542: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.89545: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.89548: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.89946: done sending task result for task 02083763-bbaf-e784-4f7d-000000000012
18714 1726853412.89951: WORKER PROCESS EXITING
18714 1726853412.89976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.90183: done with get_vars()
18714 1726853412.90189: variable 'ansible_search_path' from source: unknown
18714 1726853412.90201: we have included files to process
18714 1726853412.90202: generating all_blocks data
18714 1726853412.90203: done generating all_blocks data
18714 1726853412.90207: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
18714 1726853412.90208: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
18714 1726853412.90210: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
18714 1726853412.90377: in VariableManager get_vars()
18714 1726853412.90393: done with get_vars()
18714 1726853412.90579: done processing included file
18714 1726853412.90581: iterating over new_blocks loaded from include file
18714 1726853412.90583: in VariableManager get_vars()
18714 1726853412.90592: done with get_vars()
18714 1726853412.90594: filtering new block on tags
18714 1726853412.90609: done filtering new block on tags
18714 1726853412.90611: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1
18714 1726853412.90615: extending task lists for all hosts with included blocks
18714 1726853412.91646: done extending task lists
18714 1726853412.91648: done processing included files
18714 1726853412.91651: results queue empty
18714 1726853412.91652: checking for any_errors_fatal
18714 1726853412.91655: done checking for any_errors_fatal
18714 1726853412.91655: checking for max_fail_percentage
18714 1726853412.91656: done checking for max_fail_percentage
18714 1726853412.91657: checking to see if all hosts have failed and the running result is not ok
18714 1726853412.91658: done checking to see if all hosts have failed
18714 1726853412.91659: getting the remaining hosts for this loop
18714 1726853412.91660: done getting the remaining hosts for this loop
18714 1726853412.91662: getting the next task for host managed_node1
18714 1726853412.91665: done getting next task for host managed_node1
18714 1726853412.91668: ^ task is: TASK: Include the task 'get_interface_stat.yml'
18714 1726853412.91670: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853412.91674: getting variables
18714 1726853412.91675: in VariableManager get_vars()
18714 1726853412.91682: Calling all_inventory to load vars for managed_node1
18714 1726853412.91684: Calling groups_inventory to load vars for managed_node1
18714 1726853412.91686: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.91691: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.91693: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.91696: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.92033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.92344: done with get_vars()
18714 1726853412.92360: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 13:30:12 -0400 (0:00:00.045) 0:00:09.307 ******
18714 1726853412.92439: entering _queue_task() for managed_node1/include_tasks
18714 1726853412.92703: worker is 1 (out of 1 available)
18714 1726853412.92827: exiting _queue_task() for managed_node1/include_tasks
18714 1726853412.92837: done queuing things up, now waiting for results queue to drain
18714 1726853412.92838: waiting for pending results...
18714 1726853412.92992: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml'
18714 1726853412.93106: in run() - task 02083763-bbaf-e784-4f7d-0000000001d3
18714 1726853412.93126: variable 'ansible_search_path' from source: unknown
18714 1726853412.93133: variable 'ansible_search_path' from source: unknown
18714 1726853412.93184: calling self._execute()
18714 1726853412.93269: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853412.93283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853412.93294: variable 'omit' from source: magic vars
18714 1726853412.93675: variable 'ansible_distribution_major_version' from source: facts
18714 1726853412.93699: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853412.93716: _execute() done
18714 1726853412.93724: dumping result to json
18714 1726853412.93732: done dumping result, returning
18714 1726853412.93742: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-e784-4f7d-0000000001d3]
18714 1726853412.93753: sending task result for task 02083763-bbaf-e784-4f7d-0000000001d3
18714 1726853412.93890: no more pending results, returning what we have
18714 1726853412.93895: in VariableManager get_vars()
18714 1726853412.94047: Calling all_inventory to load vars for managed_node1
18714 1726853412.94054: Calling groups_inventory to load vars for managed_node1
18714 1726853412.94058: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.94077: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.94081: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.94087: done sending task result for task 02083763-bbaf-e784-4f7d-0000000001d3
18714 1726853412.94090: WORKER PROCESS EXITING
18714 1726853412.94094: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.94470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.94734: done with get_vars()
18714 1726853412.94742: variable 'ansible_search_path' from source: unknown
18714 1726853412.94743: variable 'ansible_search_path' from source: unknown
18714 1726853412.94784: we have included files to process
18714 1726853412.94786: generating all_blocks data
18714 1726853412.94787: done generating all_blocks data
18714 1726853412.94789: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
18714 1726853412.94790: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
18714 1726853412.94792: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
18714 1726853412.95022: done processing included file
18714 1726853412.95024: iterating over new_blocks loaded from include file
18714 1726853412.95026: in VariableManager get_vars()
18714 1726853412.95043: done with get_vars()
18714 1726853412.95045: filtering new block on tags
18714 1726853412.95062: done filtering new block on tags
18714 1726853412.95065: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1
18714 1726853412.95069: extending task lists for all hosts with included blocks
18714 1726853412.95161: done extending task lists
18714 1726853412.95162: done processing included files
18714 1726853412.95163: results queue empty
18714 1726853412.95163: checking for any_errors_fatal
18714 1726853412.95167: done checking for any_errors_fatal
18714 1726853412.95167: checking for max_fail_percentage
18714 1726853412.95168: done checking for max_fail_percentage
18714 1726853412.95169: checking to see if all hosts have failed and the running result is not ok
18714 1726853412.95170: done checking to see if all hosts have failed
18714 1726853412.95170: getting the remaining hosts for this loop
18714 1726853412.95173: done getting the remaining hosts for this loop
18714 1726853412.95175: getting the next task for host managed_node1
18714 1726853412.95179: done getting next task for host managed_node1
18714 1726853412.95181: ^ task is: TASK: Get stat for interface {{ interface }}
18714 1726853412.95184: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853412.95186: getting variables
18714 1726853412.95187: in VariableManager get_vars()
18714 1726853412.95194: Calling all_inventory to load vars for managed_node1
18714 1726853412.95196: Calling groups_inventory to load vars for managed_node1
18714 1726853412.95199: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853412.95204: Calling all_plugins_play to load vars for managed_node1
18714 1726853412.95206: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853412.95209: Calling groups_plugins_play to load vars for managed_node1
18714 1726853412.95346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853412.95663: done with get_vars()
18714 1726853412.95692: done getting variables
18714 1726853412.95839: variable 'interface' from source: set_fact

TASK [Get stat for interface lsr27] ********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 13:30:12 -0400 (0:00:00.034) 0:00:09.342 ******
18714 1726853412.95870: entering _queue_task() for managed_node1/stat
18714 1726853412.96133: worker is 1 (out of 1 available)
18714 1726853412.96146: exiting _queue_task() for managed_node1/stat
18714 1726853412.96161: done queuing things up, now waiting for results queue to drain
18714 1726853412.96162: waiting for pending results...
18714 1726853412.96605: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 18714 1726853412.96712: in run() - task 02083763-bbaf-e784-4f7d-00000000021e 18714 1726853412.96995: variable 'ansible_search_path' from source: unknown 18714 1726853412.96998: variable 'ansible_search_path' from source: unknown 18714 1726853412.97002: calling self._execute() 18714 1726853412.97101: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853412.97137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853412.97163: variable 'omit' from source: magic vars 18714 1726853412.97623: variable 'ansible_distribution_major_version' from source: facts 18714 1726853412.97638: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853412.97648: variable 'omit' from source: magic vars 18714 1726853412.97701: variable 'omit' from source: magic vars 18714 1726853412.97799: variable 'interface' from source: set_fact 18714 1726853412.97820: variable 'omit' from source: magic vars 18714 1726853412.97863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853412.97906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853412.97929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853412.97983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853412.97986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853412.98001: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853412.98009: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853412.98016: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853412.98118: Set connection var ansible_shell_executable to /bin/sh 18714 1726853412.98129: Set connection var ansible_timeout to 10 18714 1726853412.98138: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853412.98199: Set connection var ansible_connection to ssh 18714 1726853412.98202: Set connection var ansible_shell_type to sh 18714 1726853412.98204: Set connection var ansible_pipelining to False 18714 1726853412.98206: variable 'ansible_shell_executable' from source: unknown 18714 1726853412.98209: variable 'ansible_connection' from source: unknown 18714 1726853412.98211: variable 'ansible_module_compression' from source: unknown 18714 1726853412.98213: variable 'ansible_shell_type' from source: unknown 18714 1726853412.98215: variable 'ansible_shell_executable' from source: unknown 18714 1726853412.98217: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853412.98218: variable 'ansible_pipelining' from source: unknown 18714 1726853412.98220: variable 'ansible_timeout' from source: unknown 18714 1726853412.98229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853412.98424: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853412.98439: variable 'omit' from source: magic vars 18714 1726853412.98448: starting attempt loop 18714 1726853412.98454: running the handler 18714 1726853412.98525: _low_level_execute_command(): starting 18714 1726853412.98529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853412.99384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853412.99436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853412.99486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.01179: stdout chunk (state=3): >>>/root <<< 18714 1726853413.01255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.01321: stderr chunk (state=3): >>><<< 18714 1726853413.01329: stdout chunk (state=3): >>><<< 18714 1726853413.01365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853413.01387: _low_level_execute_command(): starting 18714 1726853413.01398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581 `" && echo ansible-tmp-1726853413.0137396-19154-187618806655581="` echo /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581 `" ) && sleep 0' 18714 1726853413.02013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853413.02026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853413.02039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853413.02059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853413.02115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853413.02184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.02199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.02233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.02297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.04178: stdout chunk (state=3): >>>ansible-tmp-1726853413.0137396-19154-187618806655581=/root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581 <<< 18714 1726853413.04354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.04358: stdout chunk (state=3): >>><<< 18714 1726853413.04360: stderr chunk (state=3): >>><<< 18714 1726853413.04385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853413.0137396-19154-187618806655581=/root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853413.04461: variable 'ansible_module_compression' from source: unknown 18714 1726853413.04512: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18714 1726853413.04553: variable 'ansible_facts' from source: unknown 18714 1726853413.04750: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/AnsiballZ_stat.py 18714 1726853413.04798: Sending initial data 18714 1726853413.04920: Sent initial data (153 bytes) 18714 1726853413.05418: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853413.05435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853413.05542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.05567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.05635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.07155: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853413.07215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853413.07262: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpzj098901 /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/AnsiballZ_stat.py <<< 18714 1726853413.07274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/AnsiballZ_stat.py" <<< 18714 1726853413.07314: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpzj098901" to remote "/root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/AnsiballZ_stat.py" <<< 18714 1726853413.08302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.08305: stdout chunk (state=3): >>><<< 18714 1726853413.08308: stderr chunk (state=3): >>><<< 18714 1726853413.08501: done transferring module to remote 18714 1726853413.08505: _low_level_execute_command(): starting 18714 1726853413.08507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/ /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/AnsiballZ_stat.py && sleep 0' 18714 1726853413.09065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853413.09082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853413.09097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853413.09114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853413.09134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 
1726853413.09146: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853413.09167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853413.09255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853413.09281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.09302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.09320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.09388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.11184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.11203: stdout chunk (state=3): >>><<< 18714 1726853413.11219: stderr chunk (state=3): >>><<< 18714 1726853413.11315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853413.11319: _low_level_execute_command(): starting 18714 1726853413.11321: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/AnsiballZ_stat.py && sleep 0' 18714 1726853413.11875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853413.11889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853413.11904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853413.12012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.12041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.12059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.12143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.27456: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28116, "dev": 23, "nlink": 1, "atime": 1726853411.47691, "mtime": 1726853411.47691, "ctime": 1726853411.47691, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18714 1726853413.28914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.28928: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853413.28980: stderr chunk (state=3): >>><<< 18714 1726853413.29239: stdout chunk (state=3): >>><<< 18714 1726853413.29243: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28116, "dev": 23, "nlink": 1, "atime": 1726853411.47691, "mtime": 1726853411.47691, "ctime": 1726853411.47691, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853413.29246: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853413.29248: _low_level_execute_command(): starting 18714 1726853413.29255: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853413.0137396-19154-187618806655581/ > /dev/null 2>&1 && sleep 0' 18714 1726853413.29888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18714 1726853413.29930: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853413.29942: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853413.29983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.29995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.30046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.31905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.31912: stderr chunk (state=3): >>><<< 18714 1726853413.31915: stdout chunk (state=3): >>><<< 18714 1726853413.31978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853413.31981: handler run complete 18714 1726853413.31987: attempt loop complete, returning result 18714 1726853413.31990: _execute() done 18714 1726853413.31994: dumping result to json 18714 1726853413.31999: done dumping result, returning 18714 1726853413.32010: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [02083763-bbaf-e784-4f7d-00000000021e] 18714 1726853413.32013: sending task result for task 02083763-bbaf-e784-4f7d-00000000021e 18714 1726853413.32120: done sending task result for task 02083763-bbaf-e784-4f7d-00000000021e 18714 1726853413.32123: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726853411.47691, "block_size": 4096, "blocks": 0, "ctime": 1726853411.47691, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28116, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1726853411.47691, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 18714 1726853413.32212: no more pending results, returning what we have 18714 1726853413.32216: results queue empty 18714 1726853413.32218: checking for any_errors_fatal 18714 1726853413.32219: done checking for any_errors_fatal 18714 
1726853413.32220: checking for max_fail_percentage 18714 1726853413.32221: done checking for max_fail_percentage 18714 1726853413.32222: checking to see if all hosts have failed and the running result is not ok 18714 1726853413.32223: done checking to see if all hosts have failed 18714 1726853413.32223: getting the remaining hosts for this loop 18714 1726853413.32225: done getting the remaining hosts for this loop 18714 1726853413.32229: getting the next task for host managed_node1 18714 1726853413.32237: done getting next task for host managed_node1 18714 1726853413.32239: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 18714 1726853413.32243: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853413.32256: getting variables 18714 1726853413.32258: in VariableManager get_vars() 18714 1726853413.32360: Calling all_inventory to load vars for managed_node1 18714 1726853413.32363: Calling groups_inventory to load vars for managed_node1 18714 1726853413.32366: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853413.32378: Calling all_plugins_play to load vars for managed_node1 18714 1726853413.32381: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853413.32385: Calling groups_plugins_play to load vars for managed_node1 18714 1726853413.32519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853413.32634: done with get_vars() 18714 1726853413.32641: done getting variables 18714 1726853413.32715: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18714 1726853413.32800: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:30:13 -0400 (0:00:00.369) 0:00:09.711 ****** 18714 1726853413.32823: entering _queue_task() for managed_node1/assert 18714 1726853413.32825: Creating lock for assert 18714 1726853413.33041: worker is 1 (out of 1 available) 18714 1726853413.33058: exiting _queue_task() for managed_node1/assert 18714 1726853413.33073: done queuing things up, now waiting for results queue to drain 18714 1726853413.33074: waiting for pending results... 
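The stat result above (a symlink at /sys/class/net/lsr27 pointing into /sys/devices/virtual/net/) is what the queued assert task will test next. A minimal sketch of the same presence check, using a throwaway directory in place of /sys/class/net (the helper name and the fake sysfs layout are illustrative, not Ansible code):

```python
import os
import tempfile

def interface_present(name, sysfs="/sys/class/net"):
    # On a real system the kernel exposes each interface as a symlink
    # under /sys/class/net; lexists() matches the stat module's view,
    # counting the link itself as present without dereferencing it.
    return os.path.lexists(os.path.join(sysfs, name))

# Stand-in for /sys/class/net with one interface, mirroring the log's
# lnk_target of "../../devices/virtual/net/lsr27".
fake_sysfs = tempfile.mkdtemp()
os.symlink("../../devices/virtual/net/lsr27",
           os.path.join(fake_sysfs, "lsr27"))

print(interface_present("lsr27", fake_sysfs))    # True
print(interface_present("missing0", fake_sysfs)) # False
```

This is why the later conditional `interface_stat.stat.exists` evaluates True even though the entry is a symlink of size 0.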
18714 1726853413.33230: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' 18714 1726853413.33290: in run() - task 02083763-bbaf-e784-4f7d-0000000001d4 18714 1726853413.33306: variable 'ansible_search_path' from source: unknown 18714 1726853413.33310: variable 'ansible_search_path' from source: unknown 18714 1726853413.33337: calling self._execute() 18714 1726853413.33398: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853413.33405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853413.33416: variable 'omit' from source: magic vars 18714 1726853413.33667: variable 'ansible_distribution_major_version' from source: facts 18714 1726853413.33678: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853413.33683: variable 'omit' from source: magic vars 18714 1726853413.33708: variable 'omit' from source: magic vars 18714 1726853413.33777: variable 'interface' from source: set_fact 18714 1726853413.33790: variable 'omit' from source: magic vars 18714 1726853413.33821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853413.33856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853413.33866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853413.33880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853413.33889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853413.33913: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853413.33917: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853413.33920: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853413.33991: Set connection var ansible_shell_executable to /bin/sh 18714 1726853413.33996: Set connection var ansible_timeout to 10 18714 1726853413.34001: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853413.34008: Set connection var ansible_connection to ssh 18714 1726853413.34012: Set connection var ansible_shell_type to sh 18714 1726853413.34017: Set connection var ansible_pipelining to False 18714 1726853413.34033: variable 'ansible_shell_executable' from source: unknown 18714 1726853413.34036: variable 'ansible_connection' from source: unknown 18714 1726853413.34038: variable 'ansible_module_compression' from source: unknown 18714 1726853413.34041: variable 'ansible_shell_type' from source: unknown 18714 1726853413.34043: variable 'ansible_shell_executable' from source: unknown 18714 1726853413.34045: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853413.34051: variable 'ansible_pipelining' from source: unknown 18714 1726853413.34054: variable 'ansible_timeout' from source: unknown 18714 1726853413.34057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853413.34153: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853413.34157: variable 'omit' from source: magic vars 18714 1726853413.34164: starting attempt loop 18714 1726853413.34167: running the handler 18714 1726853413.34254: variable 'interface_stat' from source: set_fact 18714 1726853413.34267: Evaluated conditional (interface_stat.stat.exists): True 18714 1726853413.34273: handler run complete 18714 1726853413.34289: attempt loop complete, returning result 18714 
1726853413.34292: _execute() done 18714 1726853413.34295: dumping result to json 18714 1726853413.34297: done dumping result, returning 18714 1726853413.34299: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' [02083763-bbaf-e784-4f7d-0000000001d4] 18714 1726853413.34304: sending task result for task 02083763-bbaf-e784-4f7d-0000000001d4 18714 1726853413.34382: done sending task result for task 02083763-bbaf-e784-4f7d-0000000001d4 18714 1726853413.34385: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18714 1726853413.34484: no more pending results, returning what we have 18714 1726853413.34487: results queue empty 18714 1726853413.34488: checking for any_errors_fatal 18714 1726853413.34495: done checking for any_errors_fatal 18714 1726853413.34496: checking for max_fail_percentage 18714 1726853413.34498: done checking for max_fail_percentage 18714 1726853413.34499: checking to see if all hosts have failed and the running result is not ok 18714 1726853413.34499: done checking to see if all hosts have failed 18714 1726853413.34500: getting the remaining hosts for this loop 18714 1726853413.34501: done getting the remaining hosts for this loop 18714 1726853413.34504: getting the next task for host managed_node1 18714 1726853413.34511: done getting next task for host managed_node1 18714 1726853413.34513: ^ task is: TASK: meta (flush_handlers) 18714 1726853413.34515: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853413.34518: getting variables 18714 1726853413.34519: in VariableManager get_vars() 18714 1726853413.34643: Calling all_inventory to load vars for managed_node1 18714 1726853413.34646: Calling groups_inventory to load vars for managed_node1 18714 1726853413.34649: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853413.34658: Calling all_plugins_play to load vars for managed_node1 18714 1726853413.34661: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853413.34664: Calling groups_plugins_play to load vars for managed_node1 18714 1726853413.34830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853413.35054: done with get_vars() 18714 1726853413.35062: done getting variables 18714 1726853413.35126: in VariableManager get_vars() 18714 1726853413.35133: Calling all_inventory to load vars for managed_node1 18714 1726853413.35136: Calling groups_inventory to load vars for managed_node1 18714 1726853413.35138: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853413.35142: Calling all_plugins_play to load vars for managed_node1 18714 1726853413.35144: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853413.35147: Calling groups_plugins_play to load vars for managed_node1 18714 1726853413.35278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853413.35449: done with get_vars() 18714 1726853413.35462: done queuing things up, now waiting for results queue to drain 18714 1726853413.35463: results queue empty 18714 1726853413.35464: checking for any_errors_fatal 18714 1726853413.35467: done checking for any_errors_fatal 18714 1726853413.35467: checking for max_fail_percentage 18714 1726853413.35468: done checking for max_fail_percentage 18714 1726853413.35469: checking to see if all hosts have failed and the running result is not 
ok 18714 1726853413.35470: done checking to see if all hosts have failed 18714 1726853413.35477: getting the remaining hosts for this loop 18714 1726853413.35478: done getting the remaining hosts for this loop 18714 1726853413.35480: getting the next task for host managed_node1 18714 1726853413.35484: done getting next task for host managed_node1 18714 1726853413.35485: ^ task is: TASK: meta (flush_handlers) 18714 1726853413.35486: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853413.35488: getting variables 18714 1726853413.35489: in VariableManager get_vars() 18714 1726853413.35497: Calling all_inventory to load vars for managed_node1 18714 1726853413.35499: Calling groups_inventory to load vars for managed_node1 18714 1726853413.35501: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853413.35505: Calling all_plugins_play to load vars for managed_node1 18714 1726853413.35508: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853413.35510: Calling groups_plugins_play to load vars for managed_node1 18714 1726853413.35625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853413.35744: done with get_vars() 18714 1726853413.35749: done getting variables 18714 1726853413.35780: in VariableManager get_vars() 18714 1726853413.35786: Calling all_inventory to load vars for managed_node1 18714 1726853413.35787: Calling groups_inventory to load vars for managed_node1 18714 1726853413.35789: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853413.35796: Calling all_plugins_play to load vars for managed_node1 18714 1726853413.35799: Calling groups_plugins_inventory to load vars for 
managed_node1 18714 1726853413.35802: Calling groups_plugins_play to load vars for managed_node1 18714 1726853413.35879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853413.35987: done with get_vars() 18714 1726853413.35995: done queuing things up, now waiting for results queue to drain 18714 1726853413.35996: results queue empty 18714 1726853413.35997: checking for any_errors_fatal 18714 1726853413.35998: done checking for any_errors_fatal 18714 1726853413.35998: checking for max_fail_percentage 18714 1726853413.35999: done checking for max_fail_percentage 18714 1726853413.36000: checking to see if all hosts have failed and the running result is not ok 18714 1726853413.36001: done checking to see if all hosts have failed 18714 1726853413.36001: getting the remaining hosts for this loop 18714 1726853413.36002: done getting the remaining hosts for this loop 18714 1726853413.36003: getting the next task for host managed_node1 18714 1726853413.36005: done getting next task for host managed_node1 18714 1726853413.36005: ^ task is: None 18714 1726853413.36006: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853413.36007: done queuing things up, now waiting for results queue to drain 18714 1726853413.36007: results queue empty 18714 1726853413.36008: checking for any_errors_fatal 18714 1726853413.36008: done checking for any_errors_fatal 18714 1726853413.36008: checking for max_fail_percentage 18714 1726853413.36009: done checking for max_fail_percentage 18714 1726853413.36009: checking to see if all hosts have failed and the running result is not ok 18714 1726853413.36010: done checking to see if all hosts have failed 18714 1726853413.36011: getting the next task for host managed_node1 18714 1726853413.36012: done getting next task for host managed_node1 18714 1726853413.36013: ^ task is: None 18714 1726853413.36014: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853413.36046: in VariableManager get_vars() 18714 1726853413.36061: done with get_vars() 18714 1726853413.36064: in VariableManager get_vars() 18714 1726853413.36074: done with get_vars() 18714 1726853413.36077: variable 'omit' from source: magic vars 18714 1726853413.36096: in VariableManager get_vars() 18714 1726853413.36104: done with get_vars() 18714 1726853413.36117: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 18714 1726853413.36475: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853413.36492: getting the remaining hosts for this loop 18714 1726853413.36493: done getting the remaining hosts for this loop 18714 1726853413.36495: getting the next task for host managed_node1 18714 1726853413.36496: done getting next task for host managed_node1 18714 1726853413.36498: ^ task is: TASK: Gathering Facts 18714 1726853413.36498: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853413.36500: getting variables 18714 1726853413.36500: in VariableManager get_vars() 18714 1726853413.36530: Calling all_inventory to load vars for managed_node1 18714 1726853413.36532: Calling groups_inventory to load vars for managed_node1 18714 1726853413.36533: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853413.36536: Calling all_plugins_play to load vars for managed_node1 18714 1726853413.36537: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853413.36539: Calling groups_plugins_play to load vars for managed_node1 18714 1726853413.36619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853413.36724: done with get_vars() 18714 1726853413.36730: done getting variables 18714 1726853413.36755: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 13:30:13 -0400 (0:00:00.039) 0:00:09.751 ****** 18714 1726853413.36775: entering _queue_task() for managed_node1/gather_facts 18714 1726853413.36946: worker is 1 (out of 1 available) 18714 1726853413.36958: exiting _queue_task() for managed_node1/gather_facts 18714 1726853413.36970: done queuing things up, now waiting for results queue to drain 18714 1726853413.36972: waiting for pending results... 
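The gather_facts task just queued will shortly run `_low_level_execute_command()` to create a uniquely named remote temp directory (visible below as `ansible-tmp-1726853413.411332-19184-163092519357031`, i.e. timestamp-pid-random). A rough sketch of how such a wrapper command could be assembled, with the naming scheme inferred from the log rather than taken from Ansible source:

```python
import os
import random
import time

def remote_tmpdir_command(base="~/.ansible/tmp"):
    # Name format inferred from the log: ansible-tmp-<time>-<pid>-<random>.
    stamp = "%s-%s-%s" % (time.time(), os.getpid(),
                          random.randint(0, 2 ** 48))
    tmpdir = "%s/ansible-tmp-%s" % (base, stamp)
    # Ensure the base dir exists, create the unique dir, and echo the
    # resulting path back so later steps (transfer, chmod, execute)
    # know where to work; "&& sleep 0" flushes output, as in the log.
    return ("/bin/sh -c '( umask 77 && mkdir -p \"%s\" && "
            "mkdir \"%s\" && echo %s ) && sleep 0'"
            % (base, tmpdir, tmpdir))

cmd = remote_tmpdir_command()
print(cmd)
```

The `umask 77` matters: the directory will hold the serialized module payload, so it must not be group- or world-readable.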
18714 1726853413.37125: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853413.37183: in run() - task 02083763-bbaf-e784-4f7d-000000000237 18714 1726853413.37203: variable 'ansible_search_path' from source: unknown 18714 1726853413.37227: calling self._execute() 18714 1726853413.37293: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853413.37297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853413.37308: variable 'omit' from source: magic vars 18714 1726853413.37562: variable 'ansible_distribution_major_version' from source: facts 18714 1726853413.37570: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853413.37577: variable 'omit' from source: magic vars 18714 1726853413.37594: variable 'omit' from source: magic vars 18714 1726853413.37618: variable 'omit' from source: magic vars 18714 1726853413.37649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853413.37678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853413.37699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853413.37726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853413.37777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853413.37780: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853413.37782: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853413.37794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853413.37976: Set connection var ansible_shell_executable to /bin/sh 18714 1726853413.37979: Set 
connection var ansible_timeout to 10 18714 1726853413.37981: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853413.37984: Set connection var ansible_connection to ssh 18714 1726853413.37986: Set connection var ansible_shell_type to sh 18714 1726853413.37988: Set connection var ansible_pipelining to False 18714 1726853413.37990: variable 'ansible_shell_executable' from source: unknown 18714 1726853413.37992: variable 'ansible_connection' from source: unknown 18714 1726853413.37994: variable 'ansible_module_compression' from source: unknown 18714 1726853413.37996: variable 'ansible_shell_type' from source: unknown 18714 1726853413.37997: variable 'ansible_shell_executable' from source: unknown 18714 1726853413.38001: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853413.38003: variable 'ansible_pipelining' from source: unknown 18714 1726853413.38004: variable 'ansible_timeout' from source: unknown 18714 1726853413.38006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853413.38179: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853413.38194: variable 'omit' from source: magic vars 18714 1726853413.38204: starting attempt loop 18714 1726853413.38211: running the handler 18714 1726853413.38377: variable 'ansible_facts' from source: unknown 18714 1726853413.38380: _low_level_execute_command(): starting 18714 1726853413.38381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853413.38941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853413.38954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 
1726853413.38986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853413.39050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853413.39096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.39113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.39133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.39205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.40827: stdout chunk (state=3): >>>/root <<< 18714 1726853413.41061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.41075: stdout chunk (state=3): >>><<< 18714 1726853413.41099: stderr chunk (state=3): >>><<< 18714 1726853413.41126: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853413.41225: _low_level_execute_command(): starting 18714 1726853413.41229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031 `" && echo ansible-tmp-1726853413.411332-19184-163092519357031="` echo /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031 `" ) && sleep 0' 18714 1726853413.41788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853413.41868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.41892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.41927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.41993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.43865: stdout chunk (state=3): >>>ansible-tmp-1726853413.411332-19184-163092519357031=/root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031 <<< 18714 1726853413.44013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.44024: stderr chunk (state=3): >>><<< 18714 1726853413.44035: stdout chunk (state=3): >>><<< 18714 1726853413.44077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853413.411332-19184-163092519357031=/root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853413.44107: variable 'ansible_module_compression' from source: unknown 18714 1726853413.44280: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853413.44283: variable 'ansible_facts' from source: unknown 18714 1726853413.44455: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/AnsiballZ_setup.py 18714 1726853413.44623: Sending initial data 18714 1726853413.44633: Sent initial data (153 bytes) 18714 1726853413.45266: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853413.45377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.45399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.45416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.45494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.47023: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853413.47090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853413.47167: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp1rpsh21g /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/AnsiballZ_setup.py <<< 18714 1726853413.47181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/AnsiballZ_setup.py" <<< 18714 1726853413.47226: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp1rpsh21g" to remote "/root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/AnsiballZ_setup.py" <<< 18714 1726853413.48932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.48988: stderr chunk (state=3): >>><<< 18714 1726853413.48991: stdout chunk (state=3): >>><<< 18714 1726853413.48993: done transferring module to remote 18714 1726853413.48995: _low_level_execute_command(): starting 18714 1726853413.48997: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/ /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/AnsiballZ_setup.py && sleep 0' 18714 1726853413.49724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853413.49740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.49802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853413.51616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853413.51619: stdout chunk (state=3): >>><<< 18714 1726853413.51622: stderr chunk (state=3): >>><<< 18714 1726853413.51733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853413.51741: _low_level_execute_command(): starting 18714 1726853413.51744: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/AnsiballZ_setup.py && sleep 0' 18714 1726853413.52325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853413.52341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853413.52359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853413.52379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853413.52491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853413.52538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853413.52581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853414.19160: stdout chunk (state=3): >>> 
{"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": 
"x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc<<< 18714 1726853414.19214: stdout chunk (state=3): >>>2656ce384f6", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2926, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 605, "free": 2926}, "nocache": {"free": 3264, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 580, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794856960, "block_size": 4096, "block_total": 65519099, "block_available": 63914760, "block_used": 1604339, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.427734375, "5m": 0.3603515625, "15m": 0.17236328125}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "14", "epoch": "1726853414", "epoch_int": "1726853414", "date": "2024-09-20", "time": "13:30:14", "iso8601_micro": "2024-09-20T17:30:14.130566Z", "iso8601": "2024-09-20T17:30:14Z", "iso8601_basic": "20240920T133014130566", "iso8601_basic_short": "20240920T133014", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": 
{}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "v<<< 18714 1726853414.19285: stdout chunk (state=3): >>>lan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", 
"macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off 
[fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853414.21160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853414.21189: stderr chunk (state=3): >>><<< 18714 1726853414.21192: stdout chunk (state=3): >>><<< 18714 1726853414.21222: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": 
"ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2926, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 605, "free": 2926}, "nocache": {"free": 3264, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], 
"labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 580, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794856960, "block_size": 4096, "block_total": 65519099, "block_available": 63914760, "block_used": 1604339, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.427734375, "5m": 0.3603515625, "15m": 0.17236328125}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "14", "epoch": "1726853414", "epoch_int": "1726853414", "date": "2024-09-20", "time": "13:30:14", "iso8601_micro": "2024-09-20T17:30:14.130566Z", "iso8601": "2024-09-20T17:30:14Z", 
"iso8601_basic": "20240920T133014130566", "iso8601_basic_short": "20240920T133014", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off 
[fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": 
"off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on 
[fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853414.21715: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853414.21729: _low_level_execute_command(): starting 18714 1726853414.21734: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853413.411332-19184-163092519357031/ > /dev/null 2>&1 && sleep 0' 18714 1726853414.22185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853414.22188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.22190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853414.22192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.22244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853414.22247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853414.22252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853414.22294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853414.24088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853414.24112: stderr chunk (state=3): >>><<< 18714 1726853414.24115: stdout chunk (state=3): >>><<< 18714 1726853414.24130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853414.24137: handler run complete 18714 1726853414.24231: variable 'ansible_facts' from source: unknown 18714 1726853414.24302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.24521: variable 'ansible_facts' from source: unknown 18714 1726853414.24583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.24673: attempt loop complete, returning result 18714 1726853414.24676: _execute() done 18714 1726853414.24679: dumping result to json 18714 1726853414.24702: done dumping result, returning 18714 1726853414.24709: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-000000000237] 18714 1726853414.24711: sending task result for task 02083763-bbaf-e784-4f7d-000000000237 18714 1726853414.25238: done sending task result for task 02083763-bbaf-e784-4f7d-000000000237 18714 1726853414.25241: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853414.25416: no more pending results, returning what we have 18714 1726853414.25418: results queue empty 18714 1726853414.25418: checking for any_errors_fatal 18714 1726853414.25419: done checking for any_errors_fatal 18714 1726853414.25420: checking for max_fail_percentage 18714 1726853414.25421: done checking for max_fail_percentage 
18714 1726853414.25421: checking to see if all hosts have failed and the running result is not ok 18714 1726853414.25422: done checking to see if all hosts have failed 18714 1726853414.25422: getting the remaining hosts for this loop 18714 1726853414.25423: done getting the remaining hosts for this loop 18714 1726853414.25426: getting the next task for host managed_node1 18714 1726853414.25430: done getting next task for host managed_node1 18714 1726853414.25431: ^ task is: TASK: meta (flush_handlers) 18714 1726853414.25432: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853414.25435: getting variables 18714 1726853414.25435: in VariableManager get_vars() 18714 1726853414.25455: Calling all_inventory to load vars for managed_node1 18714 1726853414.25457: Calling groups_inventory to load vars for managed_node1 18714 1726853414.25458: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.25465: Calling all_plugins_play to load vars for managed_node1 18714 1726853414.25467: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.25468: Calling groups_plugins_play to load vars for managed_node1 18714 1726853414.25573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.25713: done with get_vars() 18714 1726853414.25719: done getting variables 18714 1726853414.25769: in VariableManager get_vars() 18714 1726853414.25778: Calling all_inventory to load vars for managed_node1 18714 1726853414.25780: Calling groups_inventory to load vars for managed_node1 18714 1726853414.25781: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.25784: Calling all_plugins_play to load vars 
for managed_node1 18714 1726853414.25785: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.25787: Calling groups_plugins_play to load vars for managed_node1 18714 1726853414.25878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.26000: done with get_vars() 18714 1726853414.26008: done queuing things up, now waiting for results queue to drain 18714 1726853414.26010: results queue empty 18714 1726853414.26010: checking for any_errors_fatal 18714 1726853414.26012: done checking for any_errors_fatal 18714 1726853414.26012: checking for max_fail_percentage 18714 1726853414.26016: done checking for max_fail_percentage 18714 1726853414.26017: checking to see if all hosts have failed and the running result is not ok 18714 1726853414.26017: done checking to see if all hosts have failed 18714 1726853414.26018: getting the remaining hosts for this loop 18714 1726853414.26018: done getting the remaining hosts for this loop 18714 1726853414.26020: getting the next task for host managed_node1 18714 1726853414.26022: done getting next task for host managed_node1 18714 1726853414.26024: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18714 1726853414.26025: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853414.26031: getting variables 18714 1726853414.26032: in VariableManager get_vars() 18714 1726853414.26040: Calling all_inventory to load vars for managed_node1 18714 1726853414.26041: Calling groups_inventory to load vars for managed_node1 18714 1726853414.26042: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.26045: Calling all_plugins_play to load vars for managed_node1 18714 1726853414.26046: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.26048: Calling groups_plugins_play to load vars for managed_node1 18714 1726853414.26137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.26285: done with get_vars() 18714 1726853414.26291: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:30:14 -0400 (0:00:00.895) 0:00:10.646 ****** 18714 1726853414.26335: entering _queue_task() for managed_node1/include_tasks 18714 1726853414.26534: worker is 1 (out of 1 available) 18714 1726853414.26547: exiting _queue_task() for managed_node1/include_tasks 18714 1726853414.26561: done queuing things up, now waiting for results queue to drain 18714 1726853414.26562: waiting for pending results... 
18714 1726853414.26721: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18714 1726853414.26786: in run() - task 02083763-bbaf-e784-4f7d-000000000019 18714 1726853414.26799: variable 'ansible_search_path' from source: unknown 18714 1726853414.26803: variable 'ansible_search_path' from source: unknown 18714 1726853414.26833: calling self._execute() 18714 1726853414.26908: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853414.26911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853414.26918: variable 'omit' from source: magic vars 18714 1726853414.27190: variable 'ansible_distribution_major_version' from source: facts 18714 1726853414.27199: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853414.27205: _execute() done 18714 1726853414.27207: dumping result to json 18714 1726853414.27212: done dumping result, returning 18714 1726853414.27219: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-e784-4f7d-000000000019] 18714 1726853414.27221: sending task result for task 02083763-bbaf-e784-4f7d-000000000019 18714 1726853414.27306: done sending task result for task 02083763-bbaf-e784-4f7d-000000000019 18714 1726853414.27309: WORKER PROCESS EXITING 18714 1726853414.27378: no more pending results, returning what we have 18714 1726853414.27382: in VariableManager get_vars() 18714 1726853414.27414: Calling all_inventory to load vars for managed_node1 18714 1726853414.27416: Calling groups_inventory to load vars for managed_node1 18714 1726853414.27419: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.27427: Calling all_plugins_play to load vars for managed_node1 18714 1726853414.27430: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.27433: Calling 
groups_plugins_play to load vars for managed_node1 18714 1726853414.27560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.27700: done with get_vars() 18714 1726853414.27706: variable 'ansible_search_path' from source: unknown 18714 1726853414.27706: variable 'ansible_search_path' from source: unknown 18714 1726853414.27724: we have included files to process 18714 1726853414.27725: generating all_blocks data 18714 1726853414.27726: done generating all_blocks data 18714 1726853414.27726: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853414.27727: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853414.27728: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853414.28199: done processing included file 18714 1726853414.28201: iterating over new_blocks loaded from include file 18714 1726853414.28202: in VariableManager get_vars() 18714 1726853414.28215: done with get_vars() 18714 1726853414.28217: filtering new block on tags 18714 1726853414.28227: done filtering new block on tags 18714 1726853414.28228: in VariableManager get_vars() 18714 1726853414.28239: done with get_vars() 18714 1726853414.28240: filtering new block on tags 18714 1726853414.28252: done filtering new block on tags 18714 1726853414.28253: in VariableManager get_vars() 18714 1726853414.28264: done with get_vars() 18714 1726853414.28264: filtering new block on tags 18714 1726853414.28275: done filtering new block on tags 18714 1726853414.28276: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18714 1726853414.28279: extending task lists for 
all hosts with included blocks 18714 1726853414.28481: done extending task lists 18714 1726853414.28482: done processing included files 18714 1726853414.28483: results queue empty 18714 1726853414.28483: checking for any_errors_fatal 18714 1726853414.28484: done checking for any_errors_fatal 18714 1726853414.28484: checking for max_fail_percentage 18714 1726853414.28485: done checking for max_fail_percentage 18714 1726853414.28486: checking to see if all hosts have failed and the running result is not ok 18714 1726853414.28486: done checking to see if all hosts have failed 18714 1726853414.28487: getting the remaining hosts for this loop 18714 1726853414.28487: done getting the remaining hosts for this loop 18714 1726853414.28489: getting the next task for host managed_node1 18714 1726853414.28491: done getting next task for host managed_node1 18714 1726853414.28492: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18714 1726853414.28494: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853414.28500: getting variables 18714 1726853414.28500: in VariableManager get_vars() 18714 1726853414.28509: Calling all_inventory to load vars for managed_node1 18714 1726853414.28510: Calling groups_inventory to load vars for managed_node1 18714 1726853414.28511: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.28514: Calling all_plugins_play to load vars for managed_node1 18714 1726853414.28515: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.28517: Calling groups_plugins_play to load vars for managed_node1 18714 1726853414.28623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.28760: done with get_vars() 18714 1726853414.28766: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:30:14 -0400 (0:00:00.024) 0:00:10.671 ****** 18714 1726853414.28812: entering _queue_task() for managed_node1/setup 18714 1726853414.28994: worker is 1 (out of 1 available) 18714 1726853414.29008: exiting _queue_task() for managed_node1/setup 18714 1726853414.29018: done queuing things up, now waiting for results queue to drain 18714 1726853414.29019: waiting for pending results... 
18714 1726853414.29165: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18714 1726853414.29231: in run() - task 02083763-bbaf-e784-4f7d-000000000279 18714 1726853414.29246: variable 'ansible_search_path' from source: unknown 18714 1726853414.29251: variable 'ansible_search_path' from source: unknown 18714 1726853414.29276: calling self._execute() 18714 1726853414.29334: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853414.29338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853414.29348: variable 'omit' from source: magic vars 18714 1726853414.29600: variable 'ansible_distribution_major_version' from source: facts 18714 1726853414.29609: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853414.29745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853414.31776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853414.31780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853414.31810: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853414.31849: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853414.31882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853414.31966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853414.32004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853414.32034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853414.32083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853414.32106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853414.32164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853414.32203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853414.32234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853414.32282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853414.32304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853414.32477: variable '__network_required_facts' from source: role 
'' defaults 18714 1726853414.32481: variable 'ansible_facts' from source: unknown 18714 1726853414.32586: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18714 1726853414.32677: when evaluation is False, skipping this task 18714 1726853414.32681: _execute() done 18714 1726853414.32683: dumping result to json 18714 1726853414.32686: done dumping result, returning 18714 1726853414.32689: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-e784-4f7d-000000000279] 18714 1726853414.32691: sending task result for task 02083763-bbaf-e784-4f7d-000000000279 18714 1726853414.32758: done sending task result for task 02083763-bbaf-e784-4f7d-000000000279 18714 1726853414.32762: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853414.32804: no more pending results, returning what we have 18714 1726853414.32807: results queue empty 18714 1726853414.32808: checking for any_errors_fatal 18714 1726853414.32809: done checking for any_errors_fatal 18714 1726853414.32810: checking for max_fail_percentage 18714 1726853414.32811: done checking for max_fail_percentage 18714 1726853414.32812: checking to see if all hosts have failed and the running result is not ok 18714 1726853414.32812: done checking to see if all hosts have failed 18714 1726853414.32813: getting the remaining hosts for this loop 18714 1726853414.32815: done getting the remaining hosts for this loop 18714 1726853414.32818: getting the next task for host managed_node1 18714 1726853414.32827: done getting next task for host managed_node1 18714 1726853414.32830: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18714 1726853414.32833: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853414.32845: getting variables 18714 1726853414.32847: in VariableManager get_vars() 18714 1726853414.32889: Calling all_inventory to load vars for managed_node1 18714 1726853414.32892: Calling groups_inventory to load vars for managed_node1 18714 1726853414.32894: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.32904: Calling all_plugins_play to load vars for managed_node1 18714 1726853414.32907: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.32909: Calling groups_plugins_play to load vars for managed_node1 18714 1726853414.33164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.33399: done with get_vars() 18714 1726853414.33410: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:30:14 -0400 (0:00:00.046) 0:00:10.718 ****** 18714 1726853414.33505: entering _queue_task() for managed_node1/stat 18714 1726853414.33758: worker is 1 (out of 1 available) 18714 1726853414.33770: exiting _queue_task() for managed_node1/stat 18714 1726853414.33982: done queuing things up, now waiting for results queue to drain 18714 1726853414.33984: waiting for pending results... 
18714 1726853414.34111: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18714 1726853414.34176: in run() - task 02083763-bbaf-e784-4f7d-00000000027b 18714 1726853414.34200: variable 'ansible_search_path' from source: unknown 18714 1726853414.34214: variable 'ansible_search_path' from source: unknown 18714 1726853414.34255: calling self._execute() 18714 1726853414.34348: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853414.34362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853414.34426: variable 'omit' from source: magic vars 18714 1726853414.34845: variable 'ansible_distribution_major_version' from source: facts 18714 1726853414.34868: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853414.35039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853414.35281: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853414.35314: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853414.35339: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853414.35365: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853414.35432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853414.35452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853414.35469: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853414.35489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853414.35552: variable '__network_is_ostree' from source: set_fact 18714 1726853414.35556: Evaluated conditional (not __network_is_ostree is defined): False 18714 1726853414.35559: when evaluation is False, skipping this task 18714 1726853414.35561: _execute() done 18714 1726853414.35563: dumping result to json 18714 1726853414.35566: done dumping result, returning 18714 1726853414.35575: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-e784-4f7d-00000000027b] 18714 1726853414.35578: sending task result for task 02083763-bbaf-e784-4f7d-00000000027b 18714 1726853414.35660: done sending task result for task 02083763-bbaf-e784-4f7d-00000000027b 18714 1726853414.35662: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18714 1726853414.35713: no more pending results, returning what we have 18714 1726853414.35716: results queue empty 18714 1726853414.35717: checking for any_errors_fatal 18714 1726853414.35722: done checking for any_errors_fatal 18714 1726853414.35723: checking for max_fail_percentage 18714 1726853414.35724: done checking for max_fail_percentage 18714 1726853414.35725: checking to see if all hosts have failed and the running result is not ok 18714 1726853414.35726: done checking to see if all hosts have failed 18714 1726853414.35726: getting the remaining hosts for this loop 18714 1726853414.35727: done getting the remaining hosts for this loop 18714 
1726853414.35731: getting the next task for host managed_node1 18714 1726853414.35737: done getting next task for host managed_node1 18714 1726853414.35740: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18714 1726853414.35743: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853414.35757: getting variables 18714 1726853414.35759: in VariableManager get_vars() 18714 1726853414.35791: Calling all_inventory to load vars for managed_node1 18714 1726853414.35794: Calling groups_inventory to load vars for managed_node1 18714 1726853414.35796: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.35803: Calling all_plugins_play to load vars for managed_node1 18714 1726853414.35805: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.35808: Calling groups_plugins_play to load vars for managed_node1 18714 1726853414.36008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.36143: done with get_vars() 18714 1726853414.36152: done getting variables 18714 1726853414.36195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:30:14 -0400 (0:00:00.027) 0:00:10.745 ****** 18714 1726853414.36218: entering _queue_task() for managed_node1/set_fact 18714 1726853414.36399: worker is 1 (out of 1 available) 18714 1726853414.36412: exiting _queue_task() for managed_node1/set_fact 18714 1726853414.36423: done queuing things up, now waiting for results queue to drain 18714 1726853414.36424: waiting for pending results... 18714 1726853414.36584: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18714 1726853414.36657: in run() - task 02083763-bbaf-e784-4f7d-00000000027c 18714 1726853414.36667: variable 'ansible_search_path' from source: unknown 18714 1726853414.36670: variable 'ansible_search_path' from source: unknown 18714 1726853414.36699: calling self._execute() 18714 1726853414.36767: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853414.36770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853414.36781: variable 'omit' from source: magic vars 18714 1726853414.37038: variable 'ansible_distribution_major_version' from source: facts 18714 1726853414.37047: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853414.37161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853414.37350: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853414.37383: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853414.37409: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 
1726853414.37463: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853414.37543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853414.37776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853414.37780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853414.37783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853414.37786: variable '__network_is_ostree' from source: set_fact 18714 1726853414.37788: Evaluated conditional (not __network_is_ostree is defined): False 18714 1726853414.37790: when evaluation is False, skipping this task 18714 1726853414.37792: _execute() done 18714 1726853414.37794: dumping result to json 18714 1726853414.37796: done dumping result, returning 18714 1726853414.37800: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-e784-4f7d-00000000027c] 18714 1726853414.37803: sending task result for task 02083763-bbaf-e784-4f7d-00000000027c 18714 1726853414.37869: done sending task result for task 02083763-bbaf-e784-4f7d-00000000027c 18714 1726853414.37875: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18714 1726853414.38027: no more pending results, returning what we 
have 18714 1726853414.38030: results queue empty 18714 1726853414.38031: checking for any_errors_fatal 18714 1726853414.38036: done checking for any_errors_fatal 18714 1726853414.38036: checking for max_fail_percentage 18714 1726853414.38038: done checking for max_fail_percentage 18714 1726853414.38039: checking to see if all hosts have failed and the running result is not ok 18714 1726853414.38039: done checking to see if all hosts have failed 18714 1726853414.38040: getting the remaining hosts for this loop 18714 1726853414.38041: done getting the remaining hosts for this loop 18714 1726853414.38044: getting the next task for host managed_node1 18714 1726853414.38051: done getting next task for host managed_node1 18714 1726853414.38054: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18714 1726853414.38056: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853414.38067: getting variables 18714 1726853414.38068: in VariableManager get_vars() 18714 1726853414.38101: Calling all_inventory to load vars for managed_node1 18714 1726853414.38104: Calling groups_inventory to load vars for managed_node1 18714 1726853414.38105: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853414.38113: Calling all_plugins_play to load vars for managed_node1 18714 1726853414.38116: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853414.38118: Calling groups_plugins_play to load vars for managed_node1 18714 1726853414.38326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853414.38663: done with get_vars() 18714 1726853414.38675: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:30:14 -0400 (0:00:00.025) 0:00:10.770 ****** 18714 1726853414.38738: entering _queue_task() for managed_node1/service_facts 18714 1726853414.38739: Creating lock for service_facts 18714 1726853414.38941: worker is 1 (out of 1 available) 18714 1726853414.38956: exiting _queue_task() for managed_node1/service_facts 18714 1726853414.38967: done queuing things up, now waiting for results queue to drain 18714 1726853414.38968: waiting for pending results... 
18714 1726853414.39128: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18714 1726853414.39201: in run() - task 02083763-bbaf-e784-4f7d-00000000027e 18714 1726853414.39212: variable 'ansible_search_path' from source: unknown 18714 1726853414.39216: variable 'ansible_search_path' from source: unknown 18714 1726853414.39243: calling self._execute() 18714 1726853414.39308: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853414.39312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853414.39320: variable 'omit' from source: magic vars 18714 1726853414.39576: variable 'ansible_distribution_major_version' from source: facts 18714 1726853414.39586: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853414.39591: variable 'omit' from source: magic vars 18714 1726853414.39623: variable 'omit' from source: magic vars 18714 1726853414.39650: variable 'omit' from source: magic vars 18714 1726853414.39682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853414.39707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853414.39723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853414.39737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853414.39748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853414.39774: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853414.39778: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853414.39780: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18714 1726853414.39845: Set connection var ansible_shell_executable to /bin/sh 18714 1726853414.39850: Set connection var ansible_timeout to 10 18714 1726853414.39861: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853414.39866: Set connection var ansible_connection to ssh 18714 1726853414.39872: Set connection var ansible_shell_type to sh 18714 1726853414.39877: Set connection var ansible_pipelining to False 18714 1726853414.39893: variable 'ansible_shell_executable' from source: unknown 18714 1726853414.39895: variable 'ansible_connection' from source: unknown 18714 1726853414.39898: variable 'ansible_module_compression' from source: unknown 18714 1726853414.39900: variable 'ansible_shell_type' from source: unknown 18714 1726853414.39903: variable 'ansible_shell_executable' from source: unknown 18714 1726853414.39909: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853414.39911: variable 'ansible_pipelining' from source: unknown 18714 1726853414.39913: variable 'ansible_timeout' from source: unknown 18714 1726853414.39915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853414.40058: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853414.40066: variable 'omit' from source: magic vars 18714 1726853414.40076: starting attempt loop 18714 1726853414.40078: running the handler 18714 1726853414.40089: _low_level_execute_command(): starting 18714 1726853414.40097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853414.40627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853414.40631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853414.40635: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.40723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853414.40767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853414.42451: stdout chunk (state=3): >>>/root <<< 18714 1726853414.42601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853414.42605: stdout chunk (state=3): >>><<< 18714 1726853414.42607: stderr chunk (state=3): >>><<< 18714 1726853414.42718: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853414.42722: _low_level_execute_command(): starting 18714 1726853414.42726: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516 `" && echo ansible-tmp-1726853414.4262958-19224-276254990468516="` echo /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516 `" ) && sleep 0' 18714 1726853414.43288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.43366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853414.43400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853414.43415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853414.43526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853414.45405: stdout chunk (state=3): >>>ansible-tmp-1726853414.4262958-19224-276254990468516=/root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516 <<< 18714 1726853414.45574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853414.45578: stdout chunk (state=3): >>><<< 18714 1726853414.45580: stderr chunk (state=3): >>><<< 18714 1726853414.45677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853414.4262958-19224-276254990468516=/root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853414.45680: variable 'ansible_module_compression' from source: unknown 18714 1726853414.45714: ANSIBALLZ: Using lock for service_facts 18714 1726853414.45722: ANSIBALLZ: Acquiring lock 18714 1726853414.45729: ANSIBALLZ: Lock acquired: 139791966563424 18714 1726853414.45737: ANSIBALLZ: Creating module 18714 1726853414.55492: ANSIBALLZ: Writing module into payload 18714 1726853414.55586: ANSIBALLZ: Writing module 18714 1726853414.55616: ANSIBALLZ: Renaming module 18714 1726853414.55629: ANSIBALLZ: Done creating module 18714 1726853414.55650: variable 'ansible_facts' from source: unknown 18714 1726853414.55723: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/AnsiballZ_service_facts.py 18714 1726853414.55938: Sending initial data 18714 1726853414.55948: Sent initial data (162 bytes) 18714 1726853414.56549: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853414.56553: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.56599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853414.56613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853414.56676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853414.58284: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18714 1726853414.58291: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853414.58339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853414.58399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpe5qi1m2k /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/AnsiballZ_service_facts.py <<< 18714 1726853414.58402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/AnsiballZ_service_facts.py" <<< 18714 1726853414.58462: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpe5qi1m2k" to remote "/root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/AnsiballZ_service_facts.py" <<< 18714 1726853414.59205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853414.59277: stderr chunk (state=3): >>><<< 18714 1726853414.59281: stdout chunk (state=3): >>><<< 18714 1726853414.59283: done transferring module to remote 18714 1726853414.59285: _low_level_execute_command(): starting 18714 1726853414.59287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/ /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/AnsiballZ_service_facts.py && sleep 0' 18714 1726853414.59669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853414.59676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853414.59703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.59706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 18714 1726853414.59708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853414.59710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.59764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853414.59780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853414.59808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853414.61567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853414.61591: stderr chunk (state=3): >>><<< 18714 1726853414.61594: stdout chunk (state=3): >>><<< 18714 1726853414.61606: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853414.61609: _low_level_execute_command(): starting 18714 1726853414.61614: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/AnsiballZ_service_facts.py && sleep 0' 18714 1726853414.62012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853414.62015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.62018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 18714 1726853414.62020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853414.62022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853414.62067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853414.62075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853414.62116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853416.15313: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": 
{"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18714 1726853416.16932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853416.16937: stdout chunk (state=3): >>><<< 18714 1726853416.16940: stderr chunk (state=3): >>><<< 18714 1726853416.16944: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853416.21706: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853416.21713: _low_level_execute_command(): starting 18714 1726853416.21718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853414.4262958-19224-276254990468516/ > /dev/null 2>&1 && sleep 0' 18714 1726853416.23125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853416.23213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853416.23240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853416.23265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853416.23366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853416.25337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853416.25341: stdout chunk (state=3): >>><<< 18714 1726853416.25344: stderr chunk (state=3): >>><<< 18714 1726853416.25463: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853416.25467: handler run complete 18714 
1726853416.25883: variable 'ansible_facts' from source: unknown 18714 1726853416.26164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853416.27105: variable 'ansible_facts' from source: unknown 18714 1726853416.27660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853416.27851: attempt loop complete, returning result 18714 1726853416.27887: _execute() done 18714 1726853416.28076: dumping result to json 18714 1726853416.28079: done dumping result, returning 18714 1726853416.28081: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-e784-4f7d-00000000027e] 18714 1726853416.28083: sending task result for task 02083763-bbaf-e784-4f7d-00000000027e 18714 1726853416.29753: done sending task result for task 02083763-bbaf-e784-4f7d-00000000027e 18714 1726853416.29756: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853416.29816: no more pending results, returning what we have 18714 1726853416.29818: results queue empty 18714 1726853416.29820: checking for any_errors_fatal 18714 1726853416.29825: done checking for any_errors_fatal 18714 1726853416.29826: checking for max_fail_percentage 18714 1726853416.29828: done checking for max_fail_percentage 18714 1726853416.29829: checking to see if all hosts have failed and the running result is not ok 18714 1726853416.29830: done checking to see if all hosts have failed 18714 1726853416.29831: getting the remaining hosts for this loop 18714 1726853416.29832: done getting the remaining hosts for this loop 18714 1726853416.29836: getting the next task for host managed_node1 18714 1726853416.29842: done getting next task for host managed_node1 18714 1726853416.29846: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 18714 1726853416.29849: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853416.29858: getting variables 18714 1726853416.29860: in VariableManager get_vars() 18714 1726853416.29901: Calling all_inventory to load vars for managed_node1 18714 1726853416.29904: Calling groups_inventory to load vars for managed_node1 18714 1726853416.29906: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853416.29917: Calling all_plugins_play to load vars for managed_node1 18714 1726853416.29920: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853416.29923: Calling groups_plugins_play to load vars for managed_node1 18714 1726853416.31242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853416.32398: done with get_vars() 18714 1726853416.32412: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:30:16 -0400 (0:00:01.938) 0:00:12.709 ****** 18714 1726853416.32626: entering _queue_task() for managed_node1/package_facts 18714 1726853416.32628: Creating lock for package_facts 18714 1726853416.33403: worker is 1 (out of 1 available) 18714 1726853416.33417: exiting _queue_task() 
for managed_node1/package_facts 18714 1726853416.33431: done queuing things up, now waiting for results queue to drain 18714 1726853416.33432: waiting for pending results... 18714 1726853416.34589: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18714 1726853416.34594: in run() - task 02083763-bbaf-e784-4f7d-00000000027f 18714 1726853416.34597: variable 'ansible_search_path' from source: unknown 18714 1726853416.34600: variable 'ansible_search_path' from source: unknown 18714 1726853416.34602: calling self._execute() 18714 1726853416.34761: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853416.34802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853416.34977: variable 'omit' from source: magic vars 18714 1726853416.35767: variable 'ansible_distribution_major_version' from source: facts 18714 1726853416.35770: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853416.35775: variable 'omit' from source: magic vars 18714 1726853416.35777: variable 'omit' from source: magic vars 18714 1726853416.35893: variable 'omit' from source: magic vars 18714 1726853416.35936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853416.36176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853416.36179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853416.36182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853416.36184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853416.36187: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 18714 1726853416.36189: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853416.36190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853416.36395: Set connection var ansible_shell_executable to /bin/sh 18714 1726853416.36407: Set connection var ansible_timeout to 10 18714 1726853416.36484: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853416.36497: Set connection var ansible_connection to ssh 18714 1726853416.36506: Set connection var ansible_shell_type to sh 18714 1726853416.36514: Set connection var ansible_pipelining to False 18714 1726853416.36543: variable 'ansible_shell_executable' from source: unknown 18714 1726853416.36551: variable 'ansible_connection' from source: unknown 18714 1726853416.36637: variable 'ansible_module_compression' from source: unknown 18714 1726853416.36740: variable 'ansible_shell_type' from source: unknown 18714 1726853416.36743: variable 'ansible_shell_executable' from source: unknown 18714 1726853416.36745: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853416.36747: variable 'ansible_pipelining' from source: unknown 18714 1726853416.36749: variable 'ansible_timeout' from source: unknown 18714 1726853416.36751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853416.37076: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853416.37094: variable 'omit' from source: magic vars 18714 1726853416.37104: starting attempt loop 18714 1726853416.37110: running the handler 18714 1726853416.37127: _low_level_execute_command(): starting 18714 1726853416.37138: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 
1726853416.38658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853416.38679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853416.38885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853416.38990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853416.39063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853416.40936: stdout chunk (state=3): >>>/root <<< 18714 1726853416.40985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853416.41017: stderr chunk (state=3): >>><<< 18714 1726853416.41040: stdout chunk (state=3): >>><<< 18714 1726853416.41101: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853416.41122: _low_level_execute_command(): starting 18714 1726853416.41188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593 `" && echo ansible-tmp-1726853416.4110818-19311-239154662779593="` echo /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593 `" ) && sleep 0' 18714 1726853416.42390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853416.42501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853416.42516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853416.42534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853416.42688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853416.42745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853416.42770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853416.42848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853416.42889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853416.44846: stdout chunk (state=3): >>>ansible-tmp-1726853416.4110818-19311-239154662779593=/root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593 <<< 18714 1726853416.45082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853416.45105: stderr chunk (state=3): >>><<< 18714 1726853416.45108: stdout chunk (state=3): >>><<< 18714 1726853416.45129: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853416.4110818-19311-239154662779593=/root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853416.45220: variable 'ansible_module_compression' from source: unknown 18714 1726853416.45435: ANSIBALLZ: Using lock for package_facts 18714 1726853416.45438: ANSIBALLZ: Acquiring lock 18714 1726853416.45440: ANSIBALLZ: Lock acquired: 139791966557328 18714 1726853416.45442: ANSIBALLZ: Creating module 18714 1726853417.10178: ANSIBALLZ: Writing module into payload 18714 1726853417.10383: ANSIBALLZ: Writing module 18714 1726853417.10461: ANSIBALLZ: Renaming module 18714 1726853417.10468: ANSIBALLZ: Done creating module 18714 1726853417.10491: variable 'ansible_facts' from source: unknown 18714 1726853417.10951: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/AnsiballZ_package_facts.py 18714 1726853417.11477: Sending initial data 18714 1726853417.11480: Sent initial data (162 bytes) 18714 1726853417.12738: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853417.12750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853417.12772: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853417.12781: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853417.12792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853417.12807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853417.12814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853417.12877: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853417.12980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853417.12983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853417.13080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853417.14684: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server 
supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853417.14735: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853417.14857: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpi245858k /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/AnsiballZ_package_facts.py <<< 18714 1726853417.14861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/AnsiballZ_package_facts.py" <<< 18714 1726853417.15078: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpi245858k" to remote "/root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/AnsiballZ_package_facts.py" <<< 18714 1726853417.17736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853417.17740: stdout chunk (state=3): >>><<< 18714 1726853417.17748: stderr chunk (state=3): >>><<< 18714 1726853417.17854: done transferring module to remote 18714 1726853417.17867: _low_level_execute_command(): starting 18714 1726853417.17873: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/ /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/AnsiballZ_package_facts.py && sleep 0' 18714 1726853417.19523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853417.19526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853417.19560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853417.19569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853417.19759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853417.19779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853417.19913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853417.21761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853417.21786: stderr chunk (state=3): >>><<< 18714 1726853417.21789: stdout chunk (state=3): >>><<< 18714 1726853417.21900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853417.21912: _low_level_execute_command(): starting 18714 1726853417.21919: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/AnsiballZ_package_facts.py && sleep 0' 18714 1726853417.23493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853417.23509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853417.23566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853417.23664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853417.23711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853417.23824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853417.23912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853417.24141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853417.68492: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": 
[{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": 
"jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": 
[{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": 
[{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": 
[{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18714 1726853417.70258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853417.70330: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853417.70364: stdout chunk (state=3): >>><<< 18714 1726853417.70367: stderr chunk (state=3): >>><<< 18714 1726853417.70480: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.45.153 closed.
18714 1726853417.77879: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
18714 1726853417.77908: _low_level_execute_command(): starting
18714 1726853417.77917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853416.4110818-19311-239154662779593/ > /dev/null 2>&1 && sleep 0'
18714 1726853417.78534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config <<<
18714 1726853417.78539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18714 1726853417.78580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK <<<
18714 1726853417.78596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18714 1726853417.78680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18714 1726853417.80644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18714 1726853417.80648: stdout chunk (state=3): >>><<<
18714 1726853417.80650: stderr chunk (state=3): >>><<<
18714 1726853417.80652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18714 1726853417.80655: handler run complete
18714 1726853417.81177: variable 'ansible_facts' from source: unknown
18714 1726853417.81420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853417.82645: variable 'ansible_facts' from source: unknown
18714 1726853417.83122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853417.84138: attempt loop complete, returning result
18714 1726853417.84153: _execute() done
18714 1726853417.84156: dumping result to json
18714 1726853417.84326: done dumping result, returning
18714 1726853417.84333: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-e784-4f7d-00000000027f]
18714 1726853417.84336: sending task result for task 02083763-bbaf-e784-4f7d-00000000027f
18714 1726853417.85674: done sending task result for task 02083763-bbaf-e784-4f7d-00000000027f
18714 1726853417.85678: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
18714 1726853417.85720: no more pending results, returning what we have
18714 1726853417.85722: results queue empty
18714 1726853417.85722: checking for any_errors_fatal
18714 1726853417.85726: done checking for any_errors_fatal
18714 1726853417.85726: checking for max_fail_percentage
18714 1726853417.85728: done checking for max_fail_percentage
18714 1726853417.85728: checking to see if all hosts have failed and the running result is not ok
18714 1726853417.85729: done checking to see if all hosts have failed
18714 1726853417.85729: getting the remaining hosts for this loop
18714 1726853417.85730: done getting the remaining hosts for this loop
18714 1726853417.85732: getting the next task for host managed_node1
18714 1726853417.85737: done getting next task for host managed_node1
18714 1726853417.85739: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
18714 1726853417.85740: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853417.85746: getting variables
18714 1726853417.85747: in VariableManager get_vars()
18714 1726853417.85768: Calling all_inventory to load vars for managed_node1
18714 1726853417.85770: Calling groups_inventory to load vars for managed_node1
18714 1726853417.85774: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853417.85780: Calling all_plugins_play to load vars for managed_node1
18714 1726853417.85782: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853417.85784: Calling groups_plugins_play to load vars for managed_node1
18714 1726853417.86790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853417.88125: done with get_vars()
18714 1726853417.88145: done getting variables
18714 1726853417.88191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024  13:30:17 -0400 (0:00:01.555)       0:00:14.265 ******
18714 1726853417.88213: entering _queue_task() for managed_node1/debug
18714 1726853417.88454: worker is 1 (out of 1 available)
18714 1726853417.88468: exiting _queue_task() for managed_node1/debug
18714 1726853417.88482: done queuing things up, now waiting for results queue to drain
18714 1726853417.88483: waiting for pending results...
18714 1726853417.88655: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
18714 1726853417.88724: in run() - task 02083763-bbaf-e784-4f7d-00000000001a
18714 1726853417.88736: variable 'ansible_search_path' from source: unknown
18714 1726853417.88739: variable 'ansible_search_path' from source: unknown
18714 1726853417.88773: calling self._execute()
18714 1726853417.88844: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853417.88850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853417.88862: variable 'omit' from source: magic vars
18714 1726853417.89143: variable 'ansible_distribution_major_version' from source: facts
18714 1726853417.89158: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853417.89161: variable 'omit' from source: magic vars
18714 1726853417.89188: variable 'omit' from source: magic vars
18714 1726853417.89255: variable 'network_provider' from source: set_fact
18714 1726853417.89273: variable 'omit' from source: magic vars
18714 1726853417.89306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18714 1726853417.89332: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18714 1726853417.89348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18714 1726853417.89363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18714 1726853417.89382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18714 1726853417.89402: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18714 1726853417.89405: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853417.89408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853417.89482: Set connection var ansible_shell_executable to /bin/sh
18714 1726853417.89493: Set connection var ansible_timeout to 10
18714 1726853417.89496: Set connection var ansible_module_compression to ZIP_DEFLATED
18714 1726853417.89498: Set connection var ansible_connection to ssh
18714 1726853417.89505: Set connection var ansible_shell_type to sh
18714 1726853417.89508: Set connection var ansible_pipelining to False
18714 1726853417.89524: variable 'ansible_shell_executable' from source: unknown
18714 1726853417.89528: variable 'ansible_connection' from source: unknown
18714 1726853417.89530: variable 'ansible_module_compression' from source: unknown
18714 1726853417.89533: variable 'ansible_shell_type' from source: unknown
18714 1726853417.89535: variable 'ansible_shell_executable' from source: unknown
18714 1726853417.89537: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853417.89539: variable 'ansible_pipelining' from source: unknown
18714 1726853417.89541: variable 'ansible_timeout' from source: unknown
18714 1726853417.89546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853417.89661: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18714 1726853417.89669: variable 'omit' from source: magic vars
18714 1726853417.89675: starting attempt loop
18714 1726853417.89678: running the handler
18714 1726853417.89776: handler run complete
18714 1726853417.89779: attempt loop complete, returning result
18714 1726853417.89781: _execute() done
18714 1726853417.89783: dumping result
to json 18714 1726853417.89785: done dumping result, returning 18714 1726853417.89787: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-e784-4f7d-00000000001a] 18714 1726853417.89789: sending task result for task 02083763-bbaf-e784-4f7d-00000000001a ok: [managed_node1] => {} MSG: Using network provider: nm 18714 1726853417.89922: no more pending results, returning what we have 18714 1726853417.89926: results queue empty 18714 1726853417.89927: checking for any_errors_fatal 18714 1726853417.89938: done checking for any_errors_fatal 18714 1726853417.89939: checking for max_fail_percentage 18714 1726853417.89940: done checking for max_fail_percentage 18714 1726853417.89941: checking to see if all hosts have failed and the running result is not ok 18714 1726853417.89942: done checking to see if all hosts have failed 18714 1726853417.89943: getting the remaining hosts for this loop 18714 1726853417.89944: done getting the remaining hosts for this loop 18714 1726853417.89947: getting the next task for host managed_node1 18714 1726853417.89956: done getting next task for host managed_node1 18714 1726853417.89960: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18714 1726853417.89961: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853417.89970: getting variables 18714 1726853417.89973: in VariableManager get_vars() 18714 1726853417.90007: Calling all_inventory to load vars for managed_node1 18714 1726853417.90010: Calling groups_inventory to load vars for managed_node1 18714 1726853417.90012: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853417.90021: Calling all_plugins_play to load vars for managed_node1 18714 1726853417.90023: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853417.90026: Calling groups_plugins_play to load vars for managed_node1 18714 1726853417.90584: done sending task result for task 02083763-bbaf-e784-4f7d-00000000001a 18714 1726853417.90587: WORKER PROCESS EXITING 18714 1726853417.91340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853417.92525: done with get_vars() 18714 1726853417.92560: done getting variables 18714 1726853417.92620: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:30:17 -0400 (0:00:00.044) 0:00:14.309 ****** 18714 1726853417.92646: entering _queue_task() for managed_node1/fail 18714 1726853417.92905: worker is 1 (out of 1 available) 18714 1726853417.92921: exiting _queue_task() for managed_node1/fail 18714 1726853417.92932: done queuing things up, now waiting for results queue to drain 18714 1726853417.92933: waiting for pending results... 
18714 1726853417.93103: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
18714 1726853417.93177: in run() - task 02083763-bbaf-e784-4f7d-00000000001b
18714 1726853417.93190: variable 'ansible_search_path' from source: unknown
18714 1726853417.93194: variable 'ansible_search_path' from source: unknown
18714 1726853417.93222: calling self._execute()
18714 1726853417.93296: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853417.93299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853417.93308: variable 'omit' from source: magic vars
18714 1726853417.93842: variable 'ansible_distribution_major_version' from source: facts
18714 1726853417.93884: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853417.93957: variable 'network_state' from source: role '' defaults
18714 1726853417.93992: Evaluated conditional (network_state != {}): False
18714 1726853417.94001: when evaluation is False, skipping this task
18714 1726853417.94004: _execute() done
18714 1726853417.94006: dumping result to json
18714 1726853417.94008: done dumping result, returning
18714 1726853417.94011: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-e784-4f7d-00000000001b]
18714 1726853417.94013: sending task result for task 02083763-bbaf-e784-4f7d-00000000001b
18714 1726853417.94120: done sending task result for task 02083763-bbaf-e784-4f7d-00000000001b
18714 1726853417.94123: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18714 1726853417.94211: no more pending results, returning what we have
18714 1726853417.94214: results queue empty
18714 1726853417.94216: checking for any_errors_fatal
18714 1726853417.94222: done checking for any_errors_fatal
18714 1726853417.94222: checking for max_fail_percentage
18714 1726853417.94224: done checking for max_fail_percentage
18714 1726853417.94225: checking to see if all hosts have failed and the running result is not ok
18714 1726853417.94226: done checking to see if all hosts have failed
18714 1726853417.94226: getting the remaining hosts for this loop
18714 1726853417.94228: done getting the remaining hosts for this loop
18714 1726853417.94232: getting the next task for host managed_node1
18714 1726853417.94240: done getting next task for host managed_node1
18714 1726853417.94246: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18714 1726853417.94250: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853417.94265: getting variables
18714 1726853417.94267: in VariableManager get_vars()
18714 1726853417.94307: Calling all_inventory to load vars for managed_node1
18714 1726853417.94310: Calling groups_inventory to load vars for managed_node1
18714 1726853417.94312: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853417.94323: Calling all_plugins_play to load vars for managed_node1
18714 1726853417.94325: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853417.94328: Calling groups_plugins_play to load vars for managed_node1
18714 1726853417.95706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853417.96988: done with get_vars()
18714 1726853417.97010: done getting variables
18714 1726853417.97070: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:30:17 -0400 (0:00:00.044) 0:00:14.354 ******
18714 1726853417.97105: entering _queue_task() for managed_node1/fail
18714 1726853417.97413: worker is 1 (out of 1 available)
18714 1726853417.97424: exiting _queue_task() for managed_node1/fail
18714 1726853417.97435: done queuing things up, now waiting for results queue to drain
18714 1726853417.97436: waiting for pending results...
18714 1726853417.97890: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18714 1726853417.97895: in run() - task 02083763-bbaf-e784-4f7d-00000000001c
18714 1726853417.97898: variable 'ansible_search_path' from source: unknown
18714 1726853417.97901: variable 'ansible_search_path' from source: unknown
18714 1726853417.97904: calling self._execute()
18714 1726853417.97966: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853417.97980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853417.98278: variable 'omit' from source: magic vars
18714 1726853417.98563: variable 'ansible_distribution_major_version' from source: facts
18714 1726853417.98582: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853417.98719: variable 'network_state' from source: role '' defaults
18714 1726853417.98735: Evaluated conditional (network_state != {}): False
18714 1726853417.98743: when evaluation is False, skipping this task
18714 1726853417.98753: _execute() done
18714 1726853417.98762: dumping result to json
18714 1726853417.98773: done dumping result, returning
18714 1726853417.98786: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-e784-4f7d-00000000001c]
18714 1726853417.98796: sending task result for task 02083763-bbaf-e784-4f7d-00000000001c
18714 1726853417.98906: done sending task result for task 02083763-bbaf-e784-4f7d-00000000001c
18714 1726853417.98914: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18714 1726853417.98982: no more pending results, returning what we have
18714 1726853417.98996: results queue empty
18714 1726853417.98998: checking for any_errors_fatal
18714 1726853417.99007: done checking for any_errors_fatal
18714 1726853417.99008: checking for max_fail_percentage
18714 1726853417.99010: done checking for max_fail_percentage
18714 1726853417.99012: checking to see if all hosts have failed and the running result is not ok
18714 1726853417.99013: done checking to see if all hosts have failed
18714 1726853417.99014: getting the remaining hosts for this loop
18714 1726853417.99016: done getting the remaining hosts for this loop
18714 1726853417.99020: getting the next task for host managed_node1
18714 1726853417.99026: done getting next task for host managed_node1
18714 1726853417.99031: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
18714 1726853417.99033: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853417.99047: getting variables
18714 1726853417.99049: in VariableManager get_vars()
18714 1726853417.99088: Calling all_inventory to load vars for managed_node1
18714 1726853417.99091: Calling groups_inventory to load vars for managed_node1
18714 1726853417.99094: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853417.99105: Calling all_plugins_play to load vars for managed_node1
18714 1726853417.99108: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853417.99110: Calling groups_plugins_play to load vars for managed_node1
18714 1726853417.99967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853418.00863: done with get_vars()
18714 1726853418.00883: done getting variables
18714 1726853418.00926: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:30:18 -0400 (0:00:00.038) 0:00:14.392 ******
18714 1726853418.00953: entering _queue_task() for managed_node1/fail
18714 1726853418.01187: worker is 1 (out of 1 available)
18714 1726853418.01203: exiting _queue_task() for managed_node1/fail
18714 1726853418.01215: done queuing things up, now waiting for results queue to drain
18714 1726853418.01216: waiting for pending results...
18714 1726853418.01383: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
18714 1726853418.01454: in run() - task 02083763-bbaf-e784-4f7d-00000000001d
18714 1726853418.01463: variable 'ansible_search_path' from source: unknown
18714 1726853418.01467: variable 'ansible_search_path' from source: unknown
18714 1726853418.01497: calling self._execute()
18714 1726853418.01567: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853418.01572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853418.01582: variable 'omit' from source: magic vars
18714 1726853418.01991: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.01995: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853418.02138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18714 1726853418.04605: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18714 1726853418.04688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18714 1726853418.04738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18714 1726853418.04784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18714 1726853418.04824: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18714 1726853418.04911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.04954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.04983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.05020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.05046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.05165: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.05189: Evaluated conditional (ansible_distribution_major_version | int > 9): True
18714 1726853418.05326: variable 'ansible_distribution' from source: facts
18714 1726853418.05336: variable '__network_rh_distros' from source: role '' defaults
18714 1726853418.05349: Evaluated conditional (ansible_distribution in __network_rh_distros): True
18714 1726853418.05641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.05689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.05779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.05782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.05795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.05848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.05888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.05924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.05967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.05985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.06033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.06060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.06091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.06147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.06170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.06504: variable 'network_connections' from source: play vars
18714 1726853418.06520: variable 'interface' from source: set_fact
18714 1726853418.06648: variable 'interface' from source: set_fact
18714 1726853418.06654: variable 'interface' from source: set_fact
18714 1726853418.06695: variable 'interface' from source: set_fact
18714 1726853418.06710: variable 'network_state' from source: role '' defaults
18714 1726853418.06793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18714 1726853418.06974: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18714 1726853418.07035: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18714 1726853418.07095: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18714 1726853418.07119: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18714 1726853418.07203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18714 1726853418.07214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18714 1726853418.07241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.07277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18714 1726853418.07327: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
18714 1726853418.07421: when evaluation is False, skipping this task
18714 1726853418.07425: _execute() done
18714 1726853418.07427: dumping result to json
18714 1726853418.07430: done dumping result, returning
18714 1726853418.07433: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-e784-4f7d-00000000001d]
18714 1726853418.07436: sending task result for task 02083763-bbaf-e784-4f7d-00000000001d
18714 1726853418.07507: done sending task result for task 02083763-bbaf-e784-4f7d-00000000001d
18714 1726853418.07510: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
18714 1726853418.07575: no more pending results, returning what we have
18714 1726853418.07579: results queue empty
18714 1726853418.07580: checking for any_errors_fatal
18714 1726853418.07586: done checking for any_errors_fatal
18714 1726853418.07587: checking for max_fail_percentage
18714 1726853418.07589: done checking for max_fail_percentage
18714 1726853418.07589: checking to see if all hosts have failed and the running result is not ok
18714 1726853418.07590: done checking to see if all hosts have failed
18714 1726853418.07591: getting the remaining hosts for this loop
18714 1726853418.07592: done getting the remaining hosts for this loop
18714 1726853418.07596: getting the next task for host managed_node1
18714 1726853418.07604: done getting next task for host managed_node1
18714 1726853418.07608: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
18714 1726853418.07610: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853418.07623: getting variables
18714 1726853418.07624: in VariableManager get_vars()
18714 1726853418.07710: Calling all_inventory to load vars for managed_node1
18714 1726853418.07713: Calling groups_inventory to load vars for managed_node1
18714 1726853418.07715: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853418.07726: Calling all_plugins_play to load vars for managed_node1
18714 1726853418.07729: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853418.07732: Calling groups_plugins_play to load vars for managed_node1
18714 1726853418.09658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853418.11368: done with get_vars()
18714 1726853418.11403: done getting variables
18714 1726853418.11517: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:30:18 -0400 (0:00:00.105) 0:00:14.498 ******
18714 1726853418.11547: entering _queue_task() for managed_node1/dnf
18714 1726853418.12095: worker is 1 (out of 1 available)
18714 1726853418.12105: exiting _queue_task() for managed_node1/dnf
18714 1726853418.12114: done queuing things up, now waiting for results queue to drain
18714 1726853418.12115: waiting for pending results...
18714 1726853418.12357: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
18714 1726853418.12363: in run() - task 02083763-bbaf-e784-4f7d-00000000001e
18714 1726853418.12381: variable 'ansible_search_path' from source: unknown
18714 1726853418.12391: variable 'ansible_search_path' from source: unknown
18714 1726853418.12433: calling self._execute()
18714 1726853418.12540: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853418.12565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853418.12584: variable 'omit' from source: magic vars
18714 1726853418.13041: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.13061: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853418.13581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18714 1726853418.15659: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18714 1726853418.15708: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18714 1726853418.15736: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18714 1726853418.15763: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18714 1726853418.15877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18714 1726853418.15881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.15923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.15927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.15955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.15966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.16110: variable 'ansible_distribution' from source: facts
18714 1726853418.16113: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.16116: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
18714 1726853418.16236: variable '__network_wireless_connections_defined' from source: role '' defaults
18714 1726853418.16331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.16364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.16495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.16498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.16501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.16503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.16506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.16511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.16568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.16584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.16626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.16643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.16742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.16746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.16773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.17078: variable 'network_connections' from source: play vars
18714 1726853418.17231: variable 'interface' from source: set_fact
18714 1726853418.17384: variable 'interface' from source: set_fact
18714 1726853418.17387: variable 'interface' from source: set_fact
18714 1726853418.17564: variable 'interface' from source: set_fact
18714 1726853418.17568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18714 1726853418.17833: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18714 1726853418.17876: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18714 1726853418.18121: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18714 1726853418.18150: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18714 1726853418.18257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18714 1726853418.18323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853418.18356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853418.18401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853418.18457: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853418.18725: variable 'network_connections' from source: play vars 18714 1726853418.18732: variable 'interface' from source: set_fact 18714 1726853418.18802: variable 'interface' from source: set_fact 18714 1726853418.18822: variable 'interface' from source: set_fact 18714 1726853418.18864: variable 'interface' from source: set_fact 18714 1726853418.18901: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853418.18905: when evaluation is False, skipping this task 18714 1726853418.18907: _execute() done 18714 1726853418.18910: dumping result to json 18714 1726853418.18912: done dumping result, returning 18714 1726853418.18919: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-00000000001e] 18714 1726853418.18921: sending task result for task 02083763-bbaf-e784-4f7d-00000000001e 18714 1726853418.19014: done sending task result for task 02083763-bbaf-e784-4f7d-00000000001e 18714 1726853418.19016: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 18714 1726853418.19101: no more pending results, returning what we have 18714 1726853418.19104: results queue empty 18714 1726853418.19105: checking for any_errors_fatal 18714 1726853418.19112: done checking for any_errors_fatal 18714 1726853418.19113: checking for max_fail_percentage 18714 1726853418.19114: done checking for max_fail_percentage 18714 1726853418.19115: checking to see if all hosts have failed and the running result is not ok 18714 1726853418.19116: done checking to see if all hosts have failed 18714 1726853418.19116: getting the remaining hosts for this loop 18714 1726853418.19118: done getting the remaining hosts for this loop 18714 1726853418.19121: getting the next task for host managed_node1 18714 1726853418.19128: done getting next task for host managed_node1 18714 1726853418.19132: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18714 1726853418.19134: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853418.19148: getting variables 18714 1726853418.19150: in VariableManager get_vars() 18714 1726853418.19194: Calling all_inventory to load vars for managed_node1 18714 1726853418.19197: Calling groups_inventory to load vars for managed_node1 18714 1726853418.19198: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853418.19207: Calling all_plugins_play to load vars for managed_node1 18714 1726853418.19209: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853418.19211: Calling groups_plugins_play to load vars for managed_node1 18714 1726853418.20345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853418.24100: done with get_vars() 18714 1726853418.24195: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18714 1726853418.24420: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:30:18 -0400 (0:00:00.129) 0:00:14.628 ****** 18714 1726853418.24494: entering _queue_task() for managed_node1/yum 18714 1726853418.24496: Creating lock for yum 18714 1726853418.25303: worker is 1 (out of 1 available) 18714 1726853418.25433: exiting _queue_task() for managed_node1/yum 18714 1726853418.25445: done queuing things up, now waiting for results queue to drain 18714 1726853418.25446: waiting for pending results... 
18714 1726853418.25985: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
18714 1726853418.26031: in run() - task 02083763-bbaf-e784-4f7d-00000000001f
18714 1726853418.26124: variable 'ansible_search_path' from source: unknown
18714 1726853418.26190: variable 'ansible_search_path' from source: unknown
18714 1726853418.26400: calling self._execute()
18714 1726853418.26522: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853418.26577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853418.26581: variable 'omit' from source: magic vars
18714 1726853418.27414: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.27433: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853418.27880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18714 1726853418.32881: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18714 1726853418.33833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18714 1726853418.34080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18714 1726853418.34083: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18714 1726853418.34086: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18714 1726853418.34244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.34282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.34450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.34499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.34519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.34713: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.34766: Evaluated conditional (ansible_distribution_major_version | int < 8): False
18714 1726853418.34790: when evaluation is False, skipping this task
18714 1726853418.34864: _execute() done
18714 1726853418.34874: dumping result to json
18714 1726853418.34883: done dumping result, returning
18714 1726853418.34900: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-00000000001f]
18714 1726853418.34909: sending task result for task 02083763-bbaf-e784-4f7d-00000000001f
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
18714 1726853418.35140: no more pending results, returning what we have
18714 1726853418.35144: results queue empty
18714 1726853418.35145: checking for any_errors_fatal
18714 1726853418.35149: done checking for any_errors_fatal
18714 1726853418.35150: checking for max_fail_percentage
18714 1726853418.35151: done checking for max_fail_percentage
18714 1726853418.35152: checking to see if all hosts have failed and the running result is not ok
18714 1726853418.35153: done checking to see if all hosts have failed
18714 1726853418.35154: getting the remaining hosts for this loop
18714 1726853418.35155: done getting the remaining hosts for this loop
18714 1726853418.35159: getting the next task for host managed_node1
18714 1726853418.35173: done getting next task for host managed_node1
18714 1726853418.35179: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
18714 1726853418.35180: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853418.35195: getting variables
18714 1726853418.35196: in VariableManager get_vars()
18714 1726853418.35234: Calling all_inventory to load vars for managed_node1
18714 1726853418.35236: Calling groups_inventory to load vars for managed_node1
18714 1726853418.35238: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853418.35248: Calling all_plugins_play to load vars for managed_node1
18714 1726853418.35250: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853418.35253: Calling groups_plugins_play to load vars for managed_node1
18714 1726853418.35998: done sending task result for task 02083763-bbaf-e784-4f7d-00000000001f
18714 1726853418.36001: WORKER PROCESS EXITING
18714 1726853418.47715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853418.50929: done with get_vars()
18714 1726853418.50979: done getting variables
18714 1726853418.51030: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:30:18 -0400 (0:00:00.266) 0:00:14.895 ******
18714 1726853418.51164: entering _queue_task() for managed_node1/fail
18714 1726853418.51685: worker is 1 (out of 1 available)
18714 1726853418.51815: exiting _queue_task() for managed_node1/fail
18714 1726853418.51826: done queuing things up, now waiting for results queue to drain
18714 1726853418.51829: waiting for pending results...
18714 1726853418.52025: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
18714 1726853418.52164: in run() - task 02083763-bbaf-e784-4f7d-000000000020
18714 1726853418.52251: variable 'ansible_search_path' from source: unknown
18714 1726853418.52254: variable 'ansible_search_path' from source: unknown
18714 1726853418.52258: calling self._execute()
18714 1726853418.52342: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853418.52361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853418.52381: variable 'omit' from source: magic vars
18714 1726853418.52815: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.52831: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853418.52970: variable '__network_wireless_connections_defined' from source: role '' defaults
18714 1726853418.53194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18714 1726853418.55626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18714 1726853418.55712: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18714 1726853418.55841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18714 1726853418.55844: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18714 1726853418.55848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18714 1726853418.55920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.55965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.56001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.56048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.56078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.56129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.56167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.56203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.56378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.56382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.56384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.56386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.56980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.56984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.56987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.57407: variable 'network_connections' from source: play vars
18714 1726853418.57428: variable 'interface' from source: set_fact
18714 1726853418.57632: variable 'interface' from source: set_fact
18714 1726853418.57647: variable 'interface' from source: set_fact
18714 1726853418.57719: variable 'interface' from source: set_fact
18714 1726853418.57952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18714 1726853418.58313: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18714 1726853418.58355: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18714 1726853418.58457: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18714 1726853418.58543: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18714 1726853418.58598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18714 1726853418.58674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18714 1726853418.58748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.58981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18714 1726853418.59045: variable '__network_team_connections_defined' from source: role '' defaults
18714 1726853418.59552: variable 'network_connections' from source: play vars
18714 1726853418.59695: variable 'interface' from source: set_fact
18714 1726853418.59743: variable 'interface' from source: set_fact
18714 1726853418.59758: variable 'interface' from source: set_fact
18714 1726853418.59891: variable 'interface' from source: set_fact
18714 1726853418.59945: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
18714 1726853418.60179: when evaluation is False, skipping this task
18714 1726853418.60182: _execute() done
18714 1726853418.60185: dumping result to json
18714 1726853418.60187: done dumping result, returning
18714 1726853418.60190: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000020]
18714 1726853418.60199: sending task result for task 02083763-bbaf-e784-4f7d-000000000020
18714 1726853418.60265: done sending task result for task 02083763-bbaf-e784-4f7d-000000000020
18714 1726853418.60268: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
18714 1726853418.60330: no more pending results, returning what we have
18714 1726853418.60333: results queue empty
18714 1726853418.60334: checking for any_errors_fatal
18714 1726853418.60342: done checking for any_errors_fatal
18714 1726853418.60343: checking for max_fail_percentage
18714 1726853418.60345: done checking for max_fail_percentage
18714 1726853418.60345: checking to see if all hosts have failed and the running result is not ok
18714 1726853418.60346: done checking to see if all hosts have failed
18714 1726853418.60347: getting the remaining hosts for this loop
18714 1726853418.60348: done getting the remaining hosts for this loop
18714 1726853418.60351: getting the next task for host managed_node1
18714 1726853418.60360: done getting next task for host managed_node1
18714 1726853418.60364: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
18714 1726853418.60366: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853418.60384: getting variables
18714 1726853418.60386: in VariableManager get_vars()
18714 1726853418.60429: Calling all_inventory to load vars for managed_node1
18714 1726853418.60431: Calling groups_inventory to load vars for managed_node1
18714 1726853418.60436: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853418.60446: Calling all_plugins_play to load vars for managed_node1
18714 1726853418.60449: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853418.60455: Calling groups_plugins_play to load vars for managed_node1
18714 1726853418.64196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853418.66167: done with get_vars()
18714 1726853418.66197: done getting variables
18714 1726853418.66255: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:30:18 -0400 (0:00:00.151) 0:00:15.046 ******
18714 1726853418.66294: entering _queue_task() for managed_node1/package
18714 1726853418.66634: worker is 1 (out of 1 available)
18714 1726853418.66646: exiting _queue_task() for managed_node1/package
18714 1726853418.66657: done queuing things up, now waiting for results queue to drain
18714 1726853418.66658: waiting for pending results...
18714 1726853418.66999: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
18714 1726853418.67050: in run() - task 02083763-bbaf-e784-4f7d-000000000021
18714 1726853418.67077: variable 'ansible_search_path' from source: unknown
18714 1726853418.67139: variable 'ansible_search_path' from source: unknown
18714 1726853418.67147: calling self._execute()
18714 1726853418.67261: variable 'ansible_host' from source: host vars for 'managed_node1'
18714 1726853418.67277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18714 1726853418.67360: variable 'omit' from source: magic vars
18714 1726853418.67735: variable 'ansible_distribution_major_version' from source: facts
18714 1726853418.67759: Evaluated conditional (ansible_distribution_major_version != '6'): True
18714 1726853418.67990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18714 1726853418.68303: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18714 1726853418.68360: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18714 1726853418.68405: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18714 1726853418.68562: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18714 1726853418.68623: variable 'network_packages' from source: role '' defaults
18714 1726853418.68747: variable '__network_provider_setup' from source: role '' defaults
18714 1726853418.68763: variable '__network_service_name_default_nm' from source: role '' defaults
18714 1726853418.68846: variable '__network_service_name_default_nm' from source: role '' defaults
18714 1726853418.68861: variable '__network_packages_default_nm' from source: role '' defaults
18714 1726853418.68943: variable '__network_packages_default_nm' from source: role '' defaults
18714 1726853418.69161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18714 1726853418.71749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18714 1726853418.71827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18714 1726853418.71923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18714 1726853418.71926: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18714 1726853418.71957: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18714 1726853418.72044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.72087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.72119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.72248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.72254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.72257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.72283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.72311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.72362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.72389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.72653: variable '__network_packages_default_gobject_packages' from source: role '' defaults
18714 1726853418.72785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.72826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.72857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.72908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.73011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.73039: variable 'ansible_python' from source: facts
18714 1726853418.73077: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
18714 1726853418.73176: variable '__network_wpa_supplicant_required' from source: role '' defaults
18714 1726853418.73268: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
18714 1726853418.73416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.73453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.73489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.73531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.73556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.73663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18714 1726853418.73677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18714 1726853418.73681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18714 1726853418.73725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18714 1726853418.73743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18714 1726853418.73920: variable 'network_connections' from source: play vars
18714 1726853418.73976: variable 'interface' from source: set_fact
18714 1726853418.74048: variable 'interface' from source: set_fact
18714 1726853418.74063: variable 'interface' from source: set_fact
18714 1726853418.74177: variable 'interface' from source: set_fact
18714 1726853418.74264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18714 1726853418.74299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18714 1726853418.74344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853418.74383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853418.74477: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853418.74778: variable 'network_connections' from source: play vars 18714 1726853418.74788: variable 'interface' from source: set_fact 18714 1726853418.74918: variable 'interface' from source: set_fact 18714 1726853418.74934: variable 'interface' from source: set_fact 18714 1726853418.75053: variable 'interface' from source: set_fact 18714 1726853418.75276: variable '__network_packages_default_wireless' from source: role '' defaults 18714 1726853418.75279: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853418.75542: variable 'network_connections' from source: play vars 18714 1726853418.75552: variable 'interface' from source: set_fact 18714 1726853418.75642: variable 'interface' from source: set_fact 18714 1726853418.75657: variable 'interface' from source: set_fact 18714 1726853418.75738: variable 'interface' from source: set_fact 18714 1726853418.75766: variable '__network_packages_default_team' from source: role '' defaults 18714 1726853418.75860: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853418.76242: variable 'network_connections' from source: play vars 18714 1726853418.76252: variable 'interface' from source: set_fact 18714 1726853418.76331: variable 'interface' from source: set_fact 18714 1726853418.76342: variable 'interface' from source: set_fact 18714 1726853418.76416: variable 'interface' from source: set_fact 18714 1726853418.76488: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 
1726853418.76549: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853418.76592: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853418.76646: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853418.76918: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18714 1726853418.77410: variable 'network_connections' from source: play vars 18714 1726853418.77463: variable 'interface' from source: set_fact 18714 1726853418.77495: variable 'interface' from source: set_fact 18714 1726853418.77506: variable 'interface' from source: set_fact 18714 1726853418.77584: variable 'interface' from source: set_fact 18714 1726853418.77603: variable 'ansible_distribution' from source: facts 18714 1726853418.77613: variable '__network_rh_distros' from source: role '' defaults 18714 1726853418.77684: variable 'ansible_distribution_major_version' from source: facts 18714 1726853418.77687: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18714 1726853418.77821: variable 'ansible_distribution' from source: facts 18714 1726853418.77831: variable '__network_rh_distros' from source: role '' defaults 18714 1726853418.77841: variable 'ansible_distribution_major_version' from source: facts 18714 1726853418.77861: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18714 1726853418.78042: variable 'ansible_distribution' from source: facts 18714 1726853418.78056: variable '__network_rh_distros' from source: role '' defaults 18714 1726853418.78068: variable 'ansible_distribution_major_version' from source: facts 18714 1726853418.78121: variable 'network_provider' from source: set_fact 18714 1726853418.78142: variable 'ansible_facts' from source: unknown 18714 1726853418.78880: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 18714 1726853418.79076: when evaluation is False, skipping this task 18714 1726853418.79079: _execute() done 18714 1726853418.79081: dumping result to json 18714 1726853418.79083: done dumping result, returning 18714 1726853418.79085: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-e784-4f7d-000000000021] 18714 1726853418.79087: sending task result for task 02083763-bbaf-e784-4f7d-000000000021 18714 1726853418.79158: done sending task result for task 02083763-bbaf-e784-4f7d-000000000021 18714 1726853418.79161: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18714 1726853418.79210: no more pending results, returning what we have 18714 1726853418.79214: results queue empty 18714 1726853418.79215: checking for any_errors_fatal 18714 1726853418.79221: done checking for any_errors_fatal 18714 1726853418.79221: checking for max_fail_percentage 18714 1726853418.79223: done checking for max_fail_percentage 18714 1726853418.79224: checking to see if all hosts have failed and the running result is not ok 18714 1726853418.79225: done checking to see if all hosts have failed 18714 1726853418.79225: getting the remaining hosts for this loop 18714 1726853418.79227: done getting the remaining hosts for this loop 18714 1726853418.79231: getting the next task for host managed_node1 18714 1726853418.79238: done getting next task for host managed_node1 18714 1726853418.79242: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18714 1726853418.79244: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853418.79260: getting variables 18714 1726853418.79261: in VariableManager get_vars() 18714 1726853418.79303: Calling all_inventory to load vars for managed_node1 18714 1726853418.79306: Calling groups_inventory to load vars for managed_node1 18714 1726853418.79308: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853418.79323: Calling all_plugins_play to load vars for managed_node1 18714 1726853418.79326: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853418.79329: Calling groups_plugins_play to load vars for managed_node1 18714 1726853418.81527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853418.82429: done with get_vars() 18714 1726853418.82444: done getting variables 18714 1726853418.82491: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:30:18 -0400 (0:00:00.162) 0:00:15.208 ****** 18714 1726853418.82512: entering _queue_task() for managed_node1/package 18714 1726853418.82745: worker is 1 (out of 1 available) 18714 1726853418.82758: exiting _queue_task() for managed_node1/package 18714 1726853418.82773: done queuing things up, now waiting for results queue to drain 18714 1726853418.82774: waiting for pending results... 
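The "Install packages" skip above hinged on the expression `not network_packages is subset(ansible_facts.packages.keys())` evaluating to False: every required package was already installed. Jinja2's `subset` test maps onto Python set inclusion, so the decision can be sketched like this (the package names are illustrative, not taken from this run):

```python
def needs_install(required, installed):
    """Mirror of the Jinja2 test `not required is subset(installed)`:
    True only when at least one required package is missing."""
    return not set(required) <= set(installed)

# Everything already present -> condition is False -> Ansible skips the task.
print(needs_install(["NetworkManager"], ["NetworkManager", "nmstate", "bash"]))  # False
# A missing package flips the condition, and the install would run.
print(needs_install(["wpa_supplicant"], ["NetworkManager"]))  # True
```

When the function returns False, the task result carries `"skip_reason": "Conditional result was False"`, exactly as logged above.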
18714 1726853418.83188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18714 1726853418.83194: in run() - task 02083763-bbaf-e784-4f7d-000000000022 18714 1726853418.83197: variable 'ansible_search_path' from source: unknown 18714 1726853418.83200: variable 'ansible_search_path' from source: unknown 18714 1726853418.83203: calling self._execute() 18714 1726853418.83291: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853418.83301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853418.83311: variable 'omit' from source: magic vars 18714 1726853418.83888: variable 'ansible_distribution_major_version' from source: facts 18714 1726853418.83904: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853418.84038: variable 'network_state' from source: role '' defaults 18714 1726853418.84053: Evaluated conditional (network_state != {}): False 18714 1726853418.84060: when evaluation is False, skipping this task 18714 1726853418.84100: _execute() done 18714 1726853418.84103: dumping result to json 18714 1726853418.84106: done dumping result, returning 18714 1726853418.84108: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-e784-4f7d-000000000022] 18714 1726853418.84111: sending task result for task 02083763-bbaf-e784-4f7d-000000000022 18714 1726853418.84323: done sending task result for task 02083763-bbaf-e784-4f7d-000000000022 18714 1726853418.84326: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853418.84375: no more pending results, returning what we have 18714 1726853418.84380: results queue empty 18714 1726853418.84381: checking 
for any_errors_fatal 18714 1726853418.84389: done checking for any_errors_fatal 18714 1726853418.84389: checking for max_fail_percentage 18714 1726853418.84391: done checking for max_fail_percentage 18714 1726853418.84393: checking to see if all hosts have failed and the running result is not ok 18714 1726853418.84393: done checking to see if all hosts have failed 18714 1726853418.84394: getting the remaining hosts for this loop 18714 1726853418.84395: done getting the remaining hosts for this loop 18714 1726853418.84398: getting the next task for host managed_node1 18714 1726853418.84406: done getting next task for host managed_node1 18714 1726853418.84410: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18714 1726853418.84413: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853418.84588: getting variables 18714 1726853418.84590: in VariableManager get_vars() 18714 1726853418.84626: Calling all_inventory to load vars for managed_node1 18714 1726853418.84629: Calling groups_inventory to load vars for managed_node1 18714 1726853418.84631: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853418.84643: Calling all_plugins_play to load vars for managed_node1 18714 1726853418.84645: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853418.84648: Calling groups_plugins_play to load vars for managed_node1 18714 1726853418.86442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853418.89441: done with get_vars() 18714 1726853418.89463: done getting variables 18714 1726853418.89542: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:30:18 -0400 (0:00:00.070) 0:00:15.279 ****** 18714 1726853418.89579: entering _queue_task() for managed_node1/package 18714 1726853418.89923: worker is 1 (out of 1 available) 18714 1726853418.89936: exiting _queue_task() for managed_node1/package 18714 1726853418.89948: done queuing things up, now waiting for results queue to drain 18714 1726853418.89949: waiting for pending results... 
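The recurring `variable '…' from source: …` records report which precedence layer each lookup resolved from: `set_fact` wins over play vars, which win over role defaults. A rough model of that layering (heavily simplified relative to Ansible's full precedence list; the variable values here are illustrative, not from this run) is a `ChainMap` with the highest-precedence mapping first:

```python
from collections import ChainMap

# Highest-precedence mapping first: ChainMap returns the first hit,
# much as the log reports the winning layer as "from source: ...".
set_facts = {"network_provider": "nm", "interface": "veth0"}
play_vars = {"network_connections": [{"name": "veth0"}]}
role_defaults = {"network_provider": "initscripts", "network_state": {}}

resolved = ChainMap(set_facts, play_vars, role_defaults)

print(resolved["network_provider"])  # "nm"  -> logged as "from source: set_fact"
print(resolved["network_state"])     # {}    -> logged as "from source: role '' defaults"
```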
18714 1726853418.90317: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18714 1726853418.90403: in run() - task 02083763-bbaf-e784-4f7d-000000000023 18714 1726853418.90410: variable 'ansible_search_path' from source: unknown 18714 1726853418.90447: variable 'ansible_search_path' from source: unknown 18714 1726853418.90487: calling self._execute() 18714 1726853418.90583: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853418.90587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853418.90600: variable 'omit' from source: magic vars 18714 1726853418.91076: variable 'ansible_distribution_major_version' from source: facts 18714 1726853418.91080: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853418.91160: variable 'network_state' from source: role '' defaults 18714 1726853418.91169: Evaluated conditional (network_state != {}): False 18714 1726853418.91174: when evaluation is False, skipping this task 18714 1726853418.91178: _execute() done 18714 1726853418.91180: dumping result to json 18714 1726853418.91182: done dumping result, returning 18714 1726853418.91191: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-e784-4f7d-000000000023] 18714 1726853418.91194: sending task result for task 02083763-bbaf-e784-4f7d-000000000023 18714 1726853418.91288: done sending task result for task 02083763-bbaf-e784-4f7d-000000000023 18714 1726853418.91291: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853418.91350: no more pending results, returning what we have 18714 1726853418.91354: results queue empty 18714 1726853418.91355: checking for 
any_errors_fatal 18714 1726853418.91361: done checking for any_errors_fatal 18714 1726853418.91362: checking for max_fail_percentage 18714 1726853418.91363: done checking for max_fail_percentage 18714 1726853418.91364: checking to see if all hosts have failed and the running result is not ok 18714 1726853418.91365: done checking to see if all hosts have failed 18714 1726853418.91366: getting the remaining hosts for this loop 18714 1726853418.91367: done getting the remaining hosts for this loop 18714 1726853418.91370: getting the next task for host managed_node1 18714 1726853418.91378: done getting next task for host managed_node1 18714 1726853418.91381: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18714 1726853418.91383: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853418.91397: getting variables 18714 1726853418.91399: in VariableManager get_vars() 18714 1726853418.91435: Calling all_inventory to load vars for managed_node1 18714 1726853418.91438: Calling groups_inventory to load vars for managed_node1 18714 1726853418.91440: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853418.91449: Calling all_plugins_play to load vars for managed_node1 18714 1726853418.91452: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853418.91455: Calling groups_plugins_play to load vars for managed_node1 18714 1726853418.93960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853418.95635: done with get_vars() 18714 1726853418.95667: done getting variables 18714 1726853418.95775: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:30:18 -0400 (0:00:00.062) 0:00:15.341 ****** 18714 1726853418.95804: entering _queue_task() for managed_node1/service 18714 1726853418.95806: Creating lock for service 18714 1726853418.96331: worker is 1 (out of 1 available) 18714 1726853418.96343: exiting _queue_task() for managed_node1/service 18714 1726853418.96354: done queuing things up, now waiting for results queue to drain 18714 1726853418.96355: waiting for pending results... 
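Both install tasks above ("Install NetworkManager and nmstate when using network_state variable" and "Install python3-libnmstate when using network_state variable") were skipped by the same guard: they run only when the caller supplies a non-empty `network_state`. A sketch of that pattern, reconstructed from the conditionals logged here rather than copied from the role's task file:

```yaml
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}   # both skips above logged this as the false condition
```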
18714 1726853418.97021: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18714 1726853418.97117: in run() - task 02083763-bbaf-e784-4f7d-000000000024 18714 1726853418.97226: variable 'ansible_search_path' from source: unknown 18714 1726853418.97230: variable 'ansible_search_path' from source: unknown 18714 1726853418.97293: calling self._execute() 18714 1726853418.97587: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853418.97592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853418.97597: variable 'omit' from source: magic vars 18714 1726853418.98183: variable 'ansible_distribution_major_version' from source: facts 18714 1726853418.98203: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853418.98326: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853418.98545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853419.00835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853419.00923: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853419.00970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853419.01080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853419.01083: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853419.01130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18714 1726853419.01168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.01208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.01254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.01277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.01334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853419.01363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.01394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.01444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.01464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.01676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853419.01679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.01681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.01683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.01685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.01807: variable 'network_connections' from source: play vars 18714 1726853419.01825: variable 'interface' from source: set_fact 18714 1726853419.01905: variable 'interface' from source: set_fact 18714 1726853419.02023: variable 'interface' from source: set_fact 18714 1726853419.02026: variable 'interface' from source: set_fact 18714 1726853419.02064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853419.02603: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853419.02645: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853419.02688: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853419.02720: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853419.02768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853419.02807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853419.02837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.02874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853419.02944: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853419.03206: variable 'network_connections' from source: play vars 18714 1726853419.03276: variable 'interface' from source: set_fact 18714 1726853419.03292: variable 'interface' from source: set_fact 18714 1726853419.03304: variable 'interface' from source: set_fact 18714 1726853419.03377: variable 'interface' from source: set_fact 18714 1726853419.03441: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853419.03444: when evaluation is False, skipping this task 18714 1726853419.03447: _execute() done 18714 1726853419.03449: dumping result to json 18714 1726853419.03454: done dumping result, returning 18714 1726853419.03456: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000024] 18714 1726853419.03501: sending task result for task 02083763-bbaf-e784-4f7d-000000000024 18714 1726853419.03768: done sending task result for task 02083763-bbaf-e784-4f7d-000000000024 18714 1726853419.03773: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18714 1726853419.03820: no more pending results, returning what we have 18714 1726853419.03823: results queue empty 18714 1726853419.03825: checking for any_errors_fatal 18714 1726853419.03844: done checking for any_errors_fatal 18714 1726853419.03845: checking for max_fail_percentage 18714 1726853419.03848: done checking for max_fail_percentage 18714 1726853419.03848: checking to see if all hosts have failed and the running result is not ok 18714 1726853419.03849: done checking to see if all hosts have failed 18714 1726853419.03850: getting the remaining hosts for this loop 18714 1726853419.03855: done getting the remaining hosts for this loop 18714 1726853419.03859: getting the next task for host managed_node1 18714 1726853419.03867: done getting next task for host managed_node1 18714 1726853419.03873: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18714 1726853419.03875: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853419.03890: getting variables 18714 1726853419.03892: in VariableManager get_vars() 18714 1726853419.03938: Calling all_inventory to load vars for managed_node1 18714 1726853419.03941: Calling groups_inventory to load vars for managed_node1 18714 1726853419.03943: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853419.03956: Calling all_plugins_play to load vars for managed_node1 18714 1726853419.03960: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853419.03963: Calling groups_plugins_play to load vars for managed_node1 18714 1726853419.05786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853419.07327: done with get_vars() 18714 1726853419.07356: done getting variables 18714 1726853419.07417: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:30:19 -0400 (0:00:00.116) 0:00:15.457 ****** 18714 1726853419.07453: entering _queue_task() for managed_node1/service 18714 1726853419.07894: worker is 1 (out of 1 available) 18714 1726853419.07905: exiting _queue_task() for managed_node1/service 18714 1726853419.07914: done queuing things up, now waiting for results queue to drain 18714 1726853419.07915: waiting for pending results... 
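Every skipped task in this log returns the same result shape: `changed: false`, the `false_condition` that failed, and a `skip_reason`. The skip path can be sketched as follows, with Python's `eval` standing in for Jinja2 templating (the helper name `evaluate_when` is illustrative, not an Ansible API):

```python
def evaluate_when(expression, variables):
    """Return a 'skipping' result dict like the ones in the log when the
    `when` expression is falsey; eval() stands in for Jinja2 here."""
    if not eval(expression, {"__builtins__": {}}, variables):
        return {
            "changed": False,
            "false_condition": expression,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

result = evaluate_when(
    "wireless_defined or team_defined",
    {"wireless_defined": False, "team_defined": False},
)
print(result["skip_reason"])  # Conditional result was False
```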
18714 1726853419.08094: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18714 1726853419.08210: in run() - task 02083763-bbaf-e784-4f7d-000000000025 18714 1726853419.08232: variable 'ansible_search_path' from source: unknown 18714 1726853419.08320: variable 'ansible_search_path' from source: unknown 18714 1726853419.08323: calling self._execute() 18714 1726853419.08397: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853419.08409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853419.08429: variable 'omit' from source: magic vars 18714 1726853419.08830: variable 'ansible_distribution_major_version' from source: facts 18714 1726853419.08847: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853419.09032: variable 'network_provider' from source: set_fact 18714 1726853419.09044: variable 'network_state' from source: role '' defaults 18714 1726853419.09062: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18714 1726853419.09077: variable 'omit' from source: magic vars 18714 1726853419.09117: variable 'omit' from source: magic vars 18714 1726853419.09150: variable 'network_service_name' from source: role '' defaults 18714 1726853419.09293: variable 'network_service_name' from source: role '' defaults 18714 1726853419.09344: variable '__network_provider_setup' from source: role '' defaults 18714 1726853419.09357: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853419.09423: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853419.09438: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853419.09509: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853419.09742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18714 1726853419.11944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853419.12034: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853419.12081: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853419.12376: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853419.12379: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853419.12382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853419.12385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.12386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.12389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.12390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.12411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18714 1726853419.12438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.12468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.12518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.12536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.12786: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18714 1726853419.12910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853419.12944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.12977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.13020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.13047: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.13144: variable 'ansible_python' from source: facts 18714 1726853419.13269: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18714 1726853419.13274: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853419.13354: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853419.13499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853419.13528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.13561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.13611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.13630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.13686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853419.13729: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853419.13760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.13810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853419.13835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853419.14030: variable 'network_connections' from source: play vars 18714 1726853419.14034: variable 'interface' from source: set_fact 18714 1726853419.14079: variable 'interface' from source: set_fact 18714 1726853419.14097: variable 'interface' from source: set_fact 18714 1726853419.14183: variable 'interface' from source: set_fact 18714 1726853419.14306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853419.14512: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853419.14565: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853419.14678: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853419.14682: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853419.14727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853419.14763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853419.14812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853419.14849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853419.14909: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853419.15208: variable 'network_connections' from source: play vars 18714 1726853419.15277: variable 'interface' from source: set_fact 18714 1726853419.15307: variable 'interface' from source: set_fact 18714 1726853419.15327: variable 'interface' from source: set_fact 18714 1726853419.15406: variable 'interface' from source: set_fact 18714 1726853419.15464: variable '__network_packages_default_wireless' from source: role '' defaults 18714 1726853419.15556: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853419.15853: variable 'network_connections' from source: play vars 18714 1726853419.15864: variable 'interface' from source: set_fact 18714 1726853419.15986: variable 'interface' from source: set_fact 18714 1726853419.15989: variable 'interface' from source: set_fact 18714 1726853419.16028: variable 'interface' from source: set_fact 18714 1726853419.16060: variable '__network_packages_default_team' from source: role '' defaults 18714 1726853419.16146: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853419.16464: variable 
'network_connections' from source: play vars 18714 1726853419.16478: variable 'interface' from source: set_fact 18714 1726853419.16561: variable 'interface' from source: set_fact 18714 1726853419.16639: variable 'interface' from source: set_fact 18714 1726853419.16657: variable 'interface' from source: set_fact 18714 1726853419.16724: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853419.16799: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853419.16811: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853419.16883: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853419.17117: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18714 1726853419.17658: variable 'network_connections' from source: play vars 18714 1726853419.17669: variable 'interface' from source: set_fact 18714 1726853419.17739: variable 'interface' from source: set_fact 18714 1726853419.17749: variable 'interface' from source: set_fact 18714 1726853419.17814: variable 'interface' from source: set_fact 18714 1726853419.17840: variable 'ansible_distribution' from source: facts 18714 1726853419.17843: variable '__network_rh_distros' from source: role '' defaults 18714 1726853419.17950: variable 'ansible_distribution_major_version' from source: facts 18714 1726853419.17956: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18714 1726853419.18070: variable 'ansible_distribution' from source: facts 18714 1726853419.18081: variable '__network_rh_distros' from source: role '' defaults 18714 1726853419.18090: variable 'ansible_distribution_major_version' from source: facts 18714 1726853419.18107: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18714 1726853419.18286: variable 'ansible_distribution' from source: 
facts 18714 1726853419.18295: variable '__network_rh_distros' from source: role '' defaults 18714 1726853419.18304: variable 'ansible_distribution_major_version' from source: facts 18714 1726853419.18341: variable 'network_provider' from source: set_fact 18714 1726853419.18373: variable 'omit' from source: magic vars 18714 1726853419.18413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853419.18445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853419.18474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853419.18503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853419.18519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853419.18554: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853419.18604: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853419.18607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853419.18680: Set connection var ansible_shell_executable to /bin/sh 18714 1726853419.18691: Set connection var ansible_timeout to 10 18714 1726853419.18700: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853419.18717: Set connection var ansible_connection to ssh 18714 1726853419.18726: Set connection var ansible_shell_type to sh 18714 1726853419.18734: Set connection var ansible_pipelining to False 18714 1726853419.18822: variable 'ansible_shell_executable' from source: unknown 18714 1726853419.18825: variable 'ansible_connection' from source: unknown 18714 1726853419.18827: variable 'ansible_module_compression' from source: unknown 18714 1726853419.18829: 
variable 'ansible_shell_type' from source: unknown 18714 1726853419.18831: variable 'ansible_shell_executable' from source: unknown 18714 1726853419.18833: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853419.18839: variable 'ansible_pipelining' from source: unknown 18714 1726853419.18841: variable 'ansible_timeout' from source: unknown 18714 1726853419.18843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853419.18920: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853419.18943: variable 'omit' from source: magic vars 18714 1726853419.18956: starting attempt loop 18714 1726853419.18965: running the handler 18714 1726853419.19049: variable 'ansible_facts' from source: unknown 18714 1726853419.19993: _low_level_execute_command(): starting 18714 1726853419.19996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853419.21093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853419.21245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853419.21328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853419.21489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853419.23198: stdout chunk (state=3): >>>/root <<< 18714 1726853419.23327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853419.23337: stdout chunk (state=3): >>><<< 18714 1726853419.23354: stderr chunk (state=3): >>><<< 18714 1726853419.23385: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 18714 1726853419.23403: _low_level_execute_command(): starting 18714 1726853419.23412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995 `" && echo ansible-tmp-1726853419.2339122-19411-43150709734995="` echo /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995 `" ) && sleep 0' 18714 1726853419.24022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853419.24039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853419.24057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853419.24139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853419.24145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853419.24200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853419.24220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853419.24359: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853419.26218: stdout chunk (state=3): >>>ansible-tmp-1726853419.2339122-19411-43150709734995=/root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995 <<< 18714 1726853419.26380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853419.26383: stdout chunk (state=3): >>><<< 18714 1726853419.26385: stderr chunk (state=3): >>><<< 18714 1726853419.26576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853419.2339122-19411-43150709734995=/root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853419.26580: variable 'ansible_module_compression' from source: unknown 18714 1726853419.26583: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 
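
The `umask 77 && mkdir ...` command above is Ansible's standard remote temp-dir idiom: create a private (0700) working directory and echo its name as a `key=value` line so the controller can parse the path from stdout. A standalone sketch of the same idiom, using a throwaway base directory rather than the real `/root/.ansible/tmp` path from this run:

```shell
# Re-run of the remote temp-dir idiom from _low_level_execute_command() above.
# base and name are throwaway stand-ins, not the actual timestamped path.
base="$(mktemp -d)"                 # stand-in for ~/.ansible/tmp
name="ansible-tmp-example"          # stand-in for the timestamped suffix
( umask 77 && mkdir -p "$base" && mkdir "$base/$name" \
  && echo "$name=$base/$name" )     # key=value line the controller parses
```

The trailing `echo` is why the next log record can report `stdout=ansible-tmp-...=...`: the controller splits that line on `=` to learn where to upload `AnsiballZ_systemd.py`.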
18714 1726853419.26586: ANSIBALLZ: Acquiring lock 18714 1726853419.26588: ANSIBALLZ: Lock acquired: 139791971422656 18714 1726853419.26590: ANSIBALLZ: Creating module 18714 1726853419.62648: ANSIBALLZ: Writing module into payload 18714 1726853419.62826: ANSIBALLZ: Writing module 18714 1726853419.62851: ANSIBALLZ: Renaming module 18714 1726853419.62859: ANSIBALLZ: Done creating module 18714 1726853419.62908: variable 'ansible_facts' from source: unknown 18714 1726853419.63148: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/AnsiballZ_systemd.py 18714 1726853419.63293: Sending initial data 18714 1726853419.63296: Sent initial data (155 bytes) 18714 1726853419.64177: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853419.64180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853419.64183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853419.64232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 
1726853419.64464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853419.66027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853419.66072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853419.66117: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp3fdq7zia /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/AnsiballZ_systemd.py <<< 18714 1726853419.66121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/AnsiballZ_systemd.py" <<< 18714 1726853419.66162: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp3fdq7zia" to remote "/root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/AnsiballZ_systemd.py" <<< 18714 1726853419.69063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853419.69067: stdout chunk (state=3): >>><<< 18714 1726853419.69076: stderr chunk (state=3): >>><<< 18714 1726853419.69131: done transferring module to remote 18714 1726853419.69172: _low_level_execute_command(): starting 18714 1726853419.69177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/ /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/AnsiballZ_systemd.py && sleep 0' 18714 1726853419.70532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853419.70634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853419.70702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853419.70730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853419.70795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853419.72631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853419.72634: stdout chunk (state=3): >>><<< 18714 1726853419.72636: stderr chunk (state=3): >>><<< 18714 1726853419.72638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853419.72641: _low_level_execute_command(): starting 18714 1726853419.72644: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/AnsiballZ_systemd.py && sleep 0' 18714 1726853419.73950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853419.73986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853419.74117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853419.74157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853419.74189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853419.74249: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18714 1726853420.03414: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321823232", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "852508000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 18714 1726853420.03461: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": 
"no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18714 1726853420.05774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853420.05778: stdout chunk (state=3): >>><<< 18714 1726853420.05780: stderr chunk (state=3): >>><<< 18714 1726853420.05783: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321823232", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "852508000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853420.05924: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853420.05942: _low_level_execute_command(): starting 18714 1726853420.05947: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853419.2339122-19411-43150709734995/ > /dev/null 2>&1 && sleep 0' 18714 1726853420.07122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853420.07128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853420.07287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853420.07298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853420.07370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853420.09243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853420.09246: stderr chunk (state=3): >>><<< 18714 1726853420.09281: stdout chunk (state=3): >>><<< 18714 1726853420.09284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853420.09287: handler run complete 18714 1726853420.09479: attempt loop complete, returning result 18714 1726853420.09482: _execute() done 18714 1726853420.09484: dumping result to json 18714 1726853420.09486: done dumping result, returning 18714 1726853420.09488: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-e784-4f7d-000000000025] 18714 1726853420.09490: sending task result for task 02083763-bbaf-e784-4f7d-000000000025 18714 1726853420.11606: done sending task result for task 02083763-bbaf-e784-4f7d-000000000025 18714 1726853420.11609: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853420.11665: no more pending results, returning what we have 18714 1726853420.11668: results queue empty 18714 1726853420.11669: checking for any_errors_fatal 18714 1726853420.11678: done checking for any_errors_fatal 18714 1726853420.11679: checking for max_fail_percentage 18714 1726853420.11681: done checking for max_fail_percentage 18714 1726853420.11682: checking to see if all hosts have failed and the running result is not ok 18714 1726853420.11683: done checking to see if all hosts have failed 18714 1726853420.11683: getting the remaining hosts for this loop 18714 1726853420.11685: done getting the remaining hosts for this loop 18714 1726853420.11688: getting the next task for host managed_node1 18714 1726853420.11695: done getting next task for host managed_node1 18714 1726853420.11699: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 18714 1726853420.11701: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853420.11710: getting variables 18714 1726853420.11712: in VariableManager get_vars() 18714 1726853420.11748: Calling all_inventory to load vars for managed_node1 18714 1726853420.11753: Calling groups_inventory to load vars for managed_node1 18714 1726853420.11756: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853420.11767: Calling all_plugins_play to load vars for managed_node1 18714 1726853420.11770: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853420.12291: Calling groups_plugins_play to load vars for managed_node1 18714 1726853420.14678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853420.16449: done with get_vars() 18714 1726853420.16477: done getting variables 18714 1726853420.16545: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:30:20 -0400 (0:00:01.091) 0:00:16.549 ****** 18714 1726853420.16584: entering _queue_task() for managed_node1/service 18714 1726853420.16933: worker is 1 (out of 1 available) 18714 1726853420.16946: exiting _queue_task() for managed_node1/service 
18714 1726853420.17081: done queuing things up, now waiting for results queue to drain 18714 1726853420.17082: waiting for pending results... 18714 1726853420.17501: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18714 1726853420.17506: in run() - task 02083763-bbaf-e784-4f7d-000000000026 18714 1726853420.17509: variable 'ansible_search_path' from source: unknown 18714 1726853420.17511: variable 'ansible_search_path' from source: unknown 18714 1726853420.17513: calling self._execute() 18714 1726853420.17576: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853420.17580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853420.17582: variable 'omit' from source: magic vars 18714 1726853420.18035: variable 'ansible_distribution_major_version' from source: facts 18714 1726853420.18039: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853420.18119: variable 'network_provider' from source: set_fact 18714 1726853420.18128: Evaluated conditional (network_provider == "nm"): True 18714 1726853420.18224: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853420.18319: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853420.18505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853420.20982: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853420.21090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853420.21101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853420.21138: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853420.21173: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853420.21264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853420.21295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853420.21376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853420.21379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853420.21382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853420.21438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853420.21778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853420.21783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 18714 1726853420.21786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853420.21788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853420.21815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853420.21842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853420.21915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853420.22076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853420.22079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853420.22353: variable 'network_connections' from source: play vars 18714 1726853420.22390: variable 'interface' from source: set_fact 18714 1726853420.22575: variable 'interface' from source: set_fact 18714 1726853420.22589: variable 'interface' from source: set_fact 18714 1726853420.22711: variable 'interface' from source: set_fact 18714 1726853420.22942: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853420.23317: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853420.23359: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853420.23433: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853420.23523: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853420.23575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853420.23608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853420.23647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853420.23683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853420.23746: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853420.24014: variable 'network_connections' from source: play vars 18714 1726853420.24030: variable 'interface' from source: set_fact 18714 1726853420.24105: variable 'interface' from source: set_fact 18714 1726853420.24116: variable 'interface' from source: set_fact 18714 1726853420.24190: variable 'interface' from source: set_fact 18714 1726853420.24235: Evaluated conditional 
(__network_wpa_supplicant_required): False 18714 1726853420.24246: when evaluation is False, skipping this task 18714 1726853420.24257: _execute() done 18714 1726853420.24281: dumping result to json 18714 1726853420.24289: done dumping result, returning 18714 1726853420.24301: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-e784-4f7d-000000000026] 18714 1726853420.24310: sending task result for task 02083763-bbaf-e784-4f7d-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18714 1726853420.24573: no more pending results, returning what we have 18714 1726853420.24577: results queue empty 18714 1726853420.24578: checking for any_errors_fatal 18714 1726853420.24607: done checking for any_errors_fatal 18714 1726853420.24608: checking for max_fail_percentage 18714 1726853420.24611: done checking for max_fail_percentage 18714 1726853420.24612: checking to see if all hosts have failed and the running result is not ok 18714 1726853420.24612: done checking to see if all hosts have failed 18714 1726853420.24613: getting the remaining hosts for this loop 18714 1726853420.24615: done getting the remaining hosts for this loop 18714 1726853420.24619: getting the next task for host managed_node1 18714 1726853420.24625: done getting next task for host managed_node1 18714 1726853420.24630: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18714 1726853420.24632: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853420.24646: getting variables 18714 1726853420.24648: in VariableManager get_vars() 18714 1726853420.24694: Calling all_inventory to load vars for managed_node1 18714 1726853420.24698: Calling groups_inventory to load vars for managed_node1 18714 1726853420.24701: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853420.24832: Calling all_plugins_play to load vars for managed_node1 18714 1726853420.24836: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853420.24840: Calling groups_plugins_play to load vars for managed_node1 18714 1726853420.25648: done sending task result for task 02083763-bbaf-e784-4f7d-000000000026 18714 1726853420.25654: WORKER PROCESS EXITING 18714 1726853420.26668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853420.28281: done with get_vars() 18714 1726853420.28312: done getting variables 18714 1726853420.28366: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:30:20 -0400 (0:00:00.118) 0:00:16.667 ****** 18714 1726853420.28397: entering _queue_task() for managed_node1/service 18714 1726853420.28856: worker is 1 (out of 1 available) 18714 1726853420.28868: exiting _queue_task() for managed_node1/service 18714 1726853420.28881: done queuing things up, now waiting for results queue to drain 18714 1726853420.28882: waiting for pending results... 
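The two skip decisions above ("Evaluated conditional (network_provider == \"nm\"): True", then "(__network_wpa_supplicant_required): False") reduce to plain boolean tests once facts and role variables are substituted into the task's `when:` list. A minimal Python sketch of that reduction, with input values assumed for illustration (the log prints only the evaluated results, not the underlying fact values):

```python
# Hedged sketch of how a when: evaluation reduces to booleans once variables
# are filled in. The fact values below are assumptions for illustration; the
# log only shows the evaluated results, not the inputs.
facts = {
    "ansible_distribution_major_version": "9",   # assumed; log only shows != '6' -> True
    "network_provider": "nm",                    # from set_fact, per the log
    "__network_wpa_supplicant_required": False,  # role default, per the log's skip
}

# Note: the major-version check is a *string* comparison, matching the
# quoted '6' in the log's "Evaluated conditional" line.
run_task = (
    facts["ansible_distribution_major_version"] != "6"
    and facts["network_provider"] == "nm"
    and facts["__network_wpa_supplicant_required"]
)
print(run_task)  # False -> "when evaluation is False, skipping this task"
```

Because every listed condition must hold, one `False` entry is enough to skip the task, which is why the log reports `"false_condition": "__network_wpa_supplicant_required"`.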
18714 1726853420.29067: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18714 1726853420.29195: in run() - task 02083763-bbaf-e784-4f7d-000000000027 18714 1726853420.29221: variable 'ansible_search_path' from source: unknown 18714 1726853420.29229: variable 'ansible_search_path' from source: unknown 18714 1726853420.29276: calling self._execute() 18714 1726853420.29380: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853420.29405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853420.29412: variable 'omit' from source: magic vars 18714 1726853420.29840: variable 'ansible_distribution_major_version' from source: facts 18714 1726853420.29843: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853420.29966: variable 'network_provider' from source: set_fact 18714 1726853420.29983: Evaluated conditional (network_provider == "initscripts"): False 18714 1726853420.29991: when evaluation is False, skipping this task 18714 1726853420.30062: _execute() done 18714 1726853420.30065: dumping result to json 18714 1726853420.30068: done dumping result, returning 18714 1726853420.30073: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-e784-4f7d-000000000027] 18714 1726853420.30075: sending task result for task 02083763-bbaf-e784-4f7d-000000000027 18714 1726853420.30142: done sending task result for task 02083763-bbaf-e784-4f7d-000000000027 18714 1726853420.30145: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853420.30198: no more pending results, returning what we have 18714 1726853420.30202: results queue empty 18714 1726853420.30204: checking for any_errors_fatal 18714 1726853420.30212: done checking for 
any_errors_fatal 18714 1726853420.30213: checking for max_fail_percentage 18714 1726853420.30215: done checking for max_fail_percentage 18714 1726853420.30216: checking to see if all hosts have failed and the running result is not ok 18714 1726853420.30217: done checking to see if all hosts have failed 18714 1726853420.30217: getting the remaining hosts for this loop 18714 1726853420.30219: done getting the remaining hosts for this loop 18714 1726853420.30222: getting the next task for host managed_node1 18714 1726853420.30230: done getting next task for host managed_node1 18714 1726853420.30234: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18714 1726853420.30236: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853420.30250: getting variables 18714 1726853420.30255: in VariableManager get_vars() 18714 1726853420.30296: Calling all_inventory to load vars for managed_node1 18714 1726853420.30299: Calling groups_inventory to load vars for managed_node1 18714 1726853420.30302: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853420.30315: Calling all_plugins_play to load vars for managed_node1 18714 1726853420.30318: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853420.30321: Calling groups_plugins_play to load vars for managed_node1 18714 1726853420.33172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853420.35063: done with get_vars() 18714 1726853420.35094: done getting variables 18714 1726853420.35161: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:30:20 -0400 (0:00:00.067) 0:00:16.735 ****** 18714 1726853420.35199: entering _queue_task() for managed_node1/copy 18714 1726853420.35580: worker is 1 (out of 1 available) 18714 1726853420.35593: exiting _queue_task() for managed_node1/copy 18714 1726853420.35604: done queuing things up, now waiting for results queue to drain 18714 1726853420.35605: waiting for pending results... 
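Each debug line carries the process id (18714) and a Unix epoch timestamp, while the task banners render the same instant as wall-clock time in the controller's UTC-4 offset. The correspondence can be checked directly; this snippet uses the epoch prefix `1726853420.35199` and the `-0400` offset taken from the banner above:

```python
from datetime import datetime, timezone, timedelta

# Render the log's epoch prefix in the banner's UTC-4 offset; the strftime
# fields are chosen to mirror the banner's date format.
ts = 1726853420.35199
tz = timezone(timedelta(hours=-4))
print(datetime.fromtimestamp(ts, tz).strftime("%A %d %B %Y %H:%M:%S %z"))
# Friday 20 September 2024 13:30:20 -0400
```

The parenthesized durations in the banner (e.g. `(0:00:00.067)`) are simply differences between consecutive epoch prefixes, which is useful when profiling where a slow run spends its time.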
18714 1726853420.35977: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18714 1726853420.36069: in run() - task 02083763-bbaf-e784-4f7d-000000000028 18714 1726853420.36075: variable 'ansible_search_path' from source: unknown 18714 1726853420.36077: variable 'ansible_search_path' from source: unknown 18714 1726853420.36082: calling self._execute() 18714 1726853420.36184: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853420.36201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853420.36216: variable 'omit' from source: magic vars 18714 1726853420.36660: variable 'ansible_distribution_major_version' from source: facts 18714 1726853420.36679: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853420.36832: variable 'network_provider' from source: set_fact 18714 1726853420.36835: Evaluated conditional (network_provider == "initscripts"): False 18714 1726853420.36839: when evaluation is False, skipping this task 18714 1726853420.36846: _execute() done 18714 1726853420.36848: dumping result to json 18714 1726853420.36853: done dumping result, returning 18714 1726853420.36942: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-e784-4f7d-000000000028] 18714 1726853420.36945: sending task result for task 02083763-bbaf-e784-4f7d-000000000028 18714 1726853420.37018: done sending task result for task 02083763-bbaf-e784-4f7d-000000000028 18714 1726853420.37020: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18714 1726853420.37088: no more pending results, returning what we have 18714 1726853420.37094: results queue empty 18714 1726853420.37095: checking for 
any_errors_fatal 18714 1726853420.37101: done checking for any_errors_fatal 18714 1726853420.37102: checking for max_fail_percentage 18714 1726853420.37104: done checking for max_fail_percentage 18714 1726853420.37105: checking to see if all hosts have failed and the running result is not ok 18714 1726853420.37106: done checking to see if all hosts have failed 18714 1726853420.37106: getting the remaining hosts for this loop 18714 1726853420.37108: done getting the remaining hosts for this loop 18714 1726853420.37111: getting the next task for host managed_node1 18714 1726853420.37117: done getting next task for host managed_node1 18714 1726853420.37121: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18714 1726853420.37123: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853420.37137: getting variables 18714 1726853420.37139: in VariableManager get_vars() 18714 1726853420.37179: Calling all_inventory to load vars for managed_node1 18714 1726853420.37182: Calling groups_inventory to load vars for managed_node1 18714 1726853420.37185: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853420.37197: Calling all_plugins_play to load vars for managed_node1 18714 1726853420.37200: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853420.37203: Calling groups_plugins_play to load vars for managed_node1 18714 1726853420.38815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853420.41056: done with get_vars() 18714 1726853420.41126: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:30:20 -0400 (0:00:00.060) 0:00:16.795 ****** 18714 1726853420.41209: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18714 1726853420.41211: Creating lock for fedora.linux_system_roles.network_connections 18714 1726853420.42072: worker is 1 (out of 1 available) 18714 1726853420.42085: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18714 1726853420.42101: done queuing things up, now waiting for results queue to drain 18714 1726853420.42102: waiting for pending results... 
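The `network_connections` task queued below is the first in this excerpt that actually executes on the remote host; later in the log it creates a per-task working directory named `ansible-tmp-1726853420.6144445-19470-21055119954960`. The three fields look like a timestamp, a worker process id (note it differs from the controller pid 18714), and a random suffix. A hypothetical Python sketch of that naming scheme, inferred from the log rather than taken from ansible-core source:

```python
import os
import random
import time

# Hypothetical reconstruction of the remote temp-dir naming seen in the log:
# ansible-tmp-<epoch>-<pid>-<random>. Field meanings are inferred from the
# log text, not confirmed against ansible-core.
def remote_tmp_name() -> str:
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**32))

print(remote_tmp_name())
```

The random suffix keeps concurrent tasks from colliding in `/root/.ansible/tmp`, and the embedded timestamp makes stale directories easy to spot when a run is interrupted before cleanup.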
18714 1726853420.42547: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18714 1726853420.42556: in run() - task 02083763-bbaf-e784-4f7d-000000000029 18714 1726853420.42560: variable 'ansible_search_path' from source: unknown 18714 1726853420.42563: variable 'ansible_search_path' from source: unknown 18714 1726853420.42821: calling self._execute() 18714 1726853420.42897: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853420.42901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853420.42929: variable 'omit' from source: magic vars 18714 1726853420.43818: variable 'ansible_distribution_major_version' from source: facts 18714 1726853420.43877: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853420.43881: variable 'omit' from source: magic vars 18714 1726853420.43994: variable 'omit' from source: magic vars 18714 1726853420.44256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853420.48263: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853420.48267: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853420.48293: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853420.48368: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853420.48378: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853420.48441: variable 'network_provider' from source: set_fact 18714 1726853420.48582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853420.48957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853420.48982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853420.49029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853420.49037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853420.49121: variable 'omit' from source: magic vars 18714 1726853420.49245: variable 'omit' from source: magic vars 18714 1726853420.49411: variable 'network_connections' from source: play vars 18714 1726853420.49414: variable 'interface' from source: set_fact 18714 1726853420.49526: variable 'interface' from source: set_fact 18714 1726853420.49573: variable 'interface' from source: set_fact 18714 1726853420.49594: variable 'interface' from source: set_fact 18714 1726853420.49976: variable 'omit' from source: magic vars 18714 1726853420.49979: variable '__lsr_ansible_managed' from source: task vars 18714 1726853420.49981: variable '__lsr_ansible_managed' from source: task vars 18714 1726853420.50277: Loaded config def from plugin (lookup/template) 18714 1726853420.50281: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18714 1726853420.50283: File lookup term: get_ansible_managed.j2 18714 
1726853420.50286: variable 'ansible_search_path' from source: unknown 18714 1726853420.50336: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18714 1726853420.50341: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18714 1726853420.50344: variable 'ansible_search_path' from source: unknown 18714 1726853420.57648: variable 'ansible_managed' from source: unknown 18714 1726853420.57776: variable 'omit' from source: magic vars 18714 1726853420.57809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853420.57836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853420.57852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853420.57866: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853420.57876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853420.57992: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853420.57996: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853420.58009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853420.58042: Set connection var ansible_shell_executable to /bin/sh 18714 1726853420.58045: Set connection var ansible_timeout to 10 18714 1726853420.58047: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853420.58078: Set connection var ansible_connection to ssh 18714 1726853420.58081: Set connection var ansible_shell_type to sh 18714 1726853420.58083: Set connection var ansible_pipelining to False 18714 1726853420.58176: variable 'ansible_shell_executable' from source: unknown 18714 1726853420.58179: variable 'ansible_connection' from source: unknown 18714 1726853420.58181: variable 'ansible_module_compression' from source: unknown 18714 1726853420.58184: variable 'ansible_shell_type' from source: unknown 18714 1726853420.58186: variable 'ansible_shell_executable' from source: unknown 18714 1726853420.58188: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853420.58190: variable 'ansible_pipelining' from source: unknown 18714 1726853420.58192: variable 'ansible_timeout' from source: unknown 18714 1726853420.58194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853420.58443: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853420.58457: variable 'omit' from source: magic vars 18714 1726853420.58459: starting attempt loop 18714 1726853420.58461: running the handler 18714 1726853420.58463: _low_level_execute_command(): starting 18714 1726853420.58465: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853420.58985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853420.59002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853420.59014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853420.59027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853420.59041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853420.59048: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853420.59058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853420.59074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853420.59105: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853420.59159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 
1726853420.59201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853420.59232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853420.59429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853420.61178: stdout chunk (state=3): >>>/root <<< 18714 1726853420.61295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853420.61302: stdout chunk (state=3): >>><<< 18714 1726853420.61310: stderr chunk (state=3): >>><<< 18714 1726853420.61443: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853420.61458: _low_level_execute_command(): starting 18714 1726853420.61464: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960 `" && echo ansible-tmp-1726853420.6144445-19470-21055119954960="` echo /root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960 `" ) && sleep 0' 18714 1726853420.62145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853420.62148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853420.62154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853420.62157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853420.62214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853420.62234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853420.62307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853420.64233: stdout chunk (state=3): >>>ansible-tmp-1726853420.6144445-19470-21055119954960=/root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960 <<< 18714 1726853420.64335: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853420.64339: stdout chunk (state=3): >>><<< 18714 1726853420.64365: stderr chunk (state=3): >>><<< 18714 1726853420.64368: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853420.6144445-19470-21055119954960=/root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853420.64408: variable 'ansible_module_compression' from source: unknown 18714 1726853420.64468: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 18714 1726853420.64474: ANSIBALLZ: Acquiring lock 18714 1726853420.64477: ANSIBALLZ: Lock acquired: 139791965545504 18714 1726853420.64479: ANSIBALLZ: Creating module 18714 1726853420.85885: ANSIBALLZ: Writing module into payload 18714 1726853420.86135: ANSIBALLZ: Writing 
module 18714 1726853420.86156: ANSIBALLZ: Renaming module 18714 1726853420.86162: ANSIBALLZ: Done creating module 18714 1726853420.86191: variable 'ansible_facts' from source: unknown 18714 1726853420.86298: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/AnsiballZ_network_connections.py 18714 1726853420.86424: Sending initial data 18714 1726853420.86427: Sent initial data (167 bytes) 18714 1726853420.87088: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853420.87162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853420.87192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853420.87306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853420.88956: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853420.89020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853420.89066: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp3edqw_c5 /root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/AnsiballZ_network_connections.py <<< 18714 1726853420.89091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/AnsiballZ_network_connections.py" <<< 18714 1726853420.89133: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp3edqw_c5" to remote "/root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/AnsiballZ_network_connections.py" <<< 18714 1726853420.90276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853420.90279: stdout chunk (state=3): >>><<< 18714 1726853420.90281: stderr chunk (state=3): >>><<< 18714 1726853420.90283: done transferring module to remote 18714 1726853420.90297: _low_level_execute_command(): starting 18714 1726853420.90304: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/ /root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/AnsiballZ_network_connections.py && sleep 0' 18714 1726853420.90987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853420.91004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853420.91090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853420.91114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853420.91155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853420.91191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853420.92987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853420.92991: stdout chunk (state=3): >>><<< 18714 1726853420.92994: stderr chunk (state=3): >>><<< 18714 1726853420.93011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853420.93092: _low_level_execute_command(): starting 18714 1726853420.93096: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/AnsiballZ_network_connections.py && sleep 0' 18714 1726853420.93618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853420.93635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853420.93650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853420.93669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853420.93691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853420.93784: stderr 
chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853420.93815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853420.93888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853421.39754: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18714 
1726853421.41979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853421.41983: stdout chunk (state=3): >>><<< 18714 1726853421.41986: stderr chunk (state=3): >>><<< 18714 1726853421.41990: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853421.42180: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853421.42183: _low_level_execute_command(): starting 18714 1726853421.42186: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853420.6144445-19470-21055119954960/ > /dev/null 2>&1 && sleep 0' 18714 1726853421.43564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853421.43676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853421.43693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853421.43762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853421.45662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853421.45709: stderr chunk (state=3): >>><<< 18714 1726853421.45788: stdout chunk (state=3): >>><<< 18714 1726853421.45807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853421.45813: handler run complete 18714 1726853421.45963: attempt loop complete, returning result 18714 1726853421.45967: _execute() done 18714 1726853421.45969: dumping result to json 18714 1726853421.45977: done dumping result, returning 18714 1726853421.45985: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-e784-4f7d-000000000029] 18714 1726853421.45988: sending task result for task 02083763-bbaf-e784-4f7d-000000000029 18714 1726853421.46128: done sending task result for task 02083763-bbaf-e784-4f7d-000000000029 18714 1726853421.46132: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631 [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631 (not-active) 18714 1726853421.46283: no more pending 
results, returning what we have 18714 1726853421.46286: results queue empty 18714 1726853421.46288: checking for any_errors_fatal 18714 1726853421.46296: done checking for any_errors_fatal 18714 1726853421.46297: checking for max_fail_percentage 18714 1726853421.46299: done checking for max_fail_percentage 18714 1726853421.46300: checking to see if all hosts have failed and the running result is not ok 18714 1726853421.46301: done checking to see if all hosts have failed 18714 1726853421.46302: getting the remaining hosts for this loop 18714 1726853421.46303: done getting the remaining hosts for this loop 18714 1726853421.46306: getting the next task for host managed_node1 18714 1726853421.46313: done getting next task for host managed_node1 18714 1726853421.46317: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18714 1726853421.46319: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853421.46329: getting variables 18714 1726853421.46330: in VariableManager get_vars() 18714 1726853421.46676: Calling all_inventory to load vars for managed_node1 18714 1726853421.46681: Calling groups_inventory to load vars for managed_node1 18714 1726853421.46684: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853421.46694: Calling all_plugins_play to load vars for managed_node1 18714 1726853421.46697: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853421.46700: Calling groups_plugins_play to load vars for managed_node1 18714 1726853421.48960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853421.51560: done with get_vars() 18714 1726853421.51592: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:30:21 -0400 (0:00:01.105) 0:00:17.901 ****** 18714 1726853421.51801: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18714 1726853421.51803: Creating lock for fedora.linux_system_roles.network_state 18714 1726853421.52609: worker is 1 (out of 1 available) 18714 1726853421.52624: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18714 1726853421.52638: done queuing things up, now waiting for results queue to drain 18714 1726853421.52639: waiting for pending results... 
18714 1726853421.53293: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18714 1726853421.53298: in run() - task 02083763-bbaf-e784-4f7d-00000000002a 18714 1726853421.53306: variable 'ansible_search_path' from source: unknown 18714 1726853421.53313: variable 'ansible_search_path' from source: unknown 18714 1726853421.53354: calling self._execute() 18714 1726853421.53678: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.53682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.53685: variable 'omit' from source: magic vars 18714 1726853421.54342: variable 'ansible_distribution_major_version' from source: facts 18714 1726853421.54358: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853421.54667: variable 'network_state' from source: role '' defaults 18714 1726853421.54684: Evaluated conditional (network_state != {}): False 18714 1726853421.54692: when evaluation is False, skipping this task 18714 1726853421.54699: _execute() done 18714 1726853421.54707: dumping result to json 18714 1726853421.54714: done dumping result, returning 18714 1726853421.54724: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-e784-4f7d-00000000002a] 18714 1726853421.54733: sending task result for task 02083763-bbaf-e784-4f7d-00000000002a 18714 1726853421.55047: done sending task result for task 02083763-bbaf-e784-4f7d-00000000002a 18714 1726853421.55050: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853421.55103: no more pending results, returning what we have 18714 1726853421.55110: results queue empty 18714 1726853421.55112: checking for any_errors_fatal 18714 1726853421.55124: done checking for any_errors_fatal 
18714 1726853421.55125: checking for max_fail_percentage 18714 1726853421.55126: done checking for max_fail_percentage 18714 1726853421.55127: checking to see if all hosts have failed and the running result is not ok 18714 1726853421.55128: done checking to see if all hosts have failed 18714 1726853421.55129: getting the remaining hosts for this loop 18714 1726853421.55130: done getting the remaining hosts for this loop 18714 1726853421.55133: getting the next task for host managed_node1 18714 1726853421.55140: done getting next task for host managed_node1 18714 1726853421.55143: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18714 1726853421.55146: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853421.55162: getting variables 18714 1726853421.55164: in VariableManager get_vars() 18714 1726853421.55348: Calling all_inventory to load vars for managed_node1 18714 1726853421.55353: Calling groups_inventory to load vars for managed_node1 18714 1726853421.55356: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853421.55364: Calling all_plugins_play to load vars for managed_node1 18714 1726853421.55366: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853421.55369: Calling groups_plugins_play to load vars for managed_node1 18714 1726853421.58037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853421.61396: done with get_vars() 18714 1726853421.61544: done getting variables 18714 1726853421.61612: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:30:21 -0400 (0:00:00.098) 0:00:17.999 ****** 18714 1726853421.61646: entering _queue_task() for managed_node1/debug 18714 1726853421.62308: worker is 1 (out of 1 available) 18714 1726853421.62413: exiting _queue_task() for managed_node1/debug 18714 1726853421.62428: done queuing things up, now waiting for results queue to drain 18714 1726853421.62429: waiting for pending results... 
18714 1726853421.62619: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18714 1726853421.62719: in run() - task 02083763-bbaf-e784-4f7d-00000000002b 18714 1726853421.62742: variable 'ansible_search_path' from source: unknown 18714 1726853421.62746: variable 'ansible_search_path' from source: unknown 18714 1726853421.62790: calling self._execute() 18714 1726853421.63076: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.63080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.63083: variable 'omit' from source: magic vars 18714 1726853421.63521: variable 'ansible_distribution_major_version' from source: facts 18714 1726853421.63525: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853421.63528: variable 'omit' from source: magic vars 18714 1726853421.63530: variable 'omit' from source: magic vars 18714 1726853421.63577: variable 'omit' from source: magic vars 18714 1726853421.63581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853421.63584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853421.63605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853421.63627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853421.63638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853421.63682: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853421.63686: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.63688: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18714 1726853421.63815: Set connection var ansible_shell_executable to /bin/sh 18714 1726853421.63821: Set connection var ansible_timeout to 10 18714 1726853421.63827: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853421.63840: Set connection var ansible_connection to ssh 18714 1726853421.63846: Set connection var ansible_shell_type to sh 18714 1726853421.63851: Set connection var ansible_pipelining to False 18714 1726853421.63884: variable 'ansible_shell_executable' from source: unknown 18714 1726853421.63887: variable 'ansible_connection' from source: unknown 18714 1726853421.63890: variable 'ansible_module_compression' from source: unknown 18714 1726853421.63892: variable 'ansible_shell_type' from source: unknown 18714 1726853421.63895: variable 'ansible_shell_executable' from source: unknown 18714 1726853421.63897: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.63899: variable 'ansible_pipelining' from source: unknown 18714 1726853421.63903: variable 'ansible_timeout' from source: unknown 18714 1726853421.63907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.64176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853421.64180: variable 'omit' from source: magic vars 18714 1726853421.64183: starting attempt loop 18714 1726853421.64186: running the handler 18714 1726853421.64232: variable '__network_connections_result' from source: set_fact 18714 1726853421.64301: handler run complete 18714 1726853421.64320: attempt loop complete, returning result 18714 1726853421.64324: _execute() done 18714 1726853421.64326: dumping result to json 18714 1726853421.64329: 
done dumping result, returning 18714 1726853421.64338: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-e784-4f7d-00000000002b] 18714 1726853421.64341: sending task result for task 02083763-bbaf-e784-4f7d-00000000002b 18714 1726853421.64587: done sending task result for task 02083763-bbaf-e784-4f7d-00000000002b 18714 1726853421.64591: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631 (not-active)" ] } 18714 1726853421.64664: no more pending results, returning what we have 18714 1726853421.64667: results queue empty 18714 1726853421.64668: checking for any_errors_fatal 18714 1726853421.64676: done checking for any_errors_fatal 18714 1726853421.64677: checking for max_fail_percentage 18714 1726853421.64680: done checking for max_fail_percentage 18714 1726853421.64680: checking to see if all hosts have failed and the running result is not ok 18714 1726853421.64681: done checking to see if all hosts have failed 18714 1726853421.64682: getting the remaining hosts for this loop 18714 1726853421.64683: done getting the remaining hosts for this loop 18714 1726853421.64687: getting the next task for host managed_node1 18714 1726853421.64693: done getting next task for host managed_node1 18714 1726853421.64696: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18714 1726853421.64698: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18714 1726853421.64781: getting variables 18714 1726853421.64783: in VariableManager get_vars() 18714 1726853421.64934: Calling all_inventory to load vars for managed_node1 18714 1726853421.64937: Calling groups_inventory to load vars for managed_node1 18714 1726853421.64940: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853421.64949: Calling all_plugins_play to load vars for managed_node1 18714 1726853421.64954: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853421.64958: Calling groups_plugins_play to load vars for managed_node1 18714 1726853421.68091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853421.70087: done with get_vars() 18714 1726853421.70110: done getting variables 18714 1726853421.70169: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:30:21 -0400 (0:00:00.087) 0:00:18.087 ****** 18714 1726853421.70404: entering _queue_task() for managed_node1/debug 18714 1726853421.70908: worker is 1 (out of 1 available) 18714 1726853421.70919: exiting _queue_task() for managed_node1/debug 18714 1726853421.70930: done queuing things up, now waiting for results queue to drain 18714 1726853421.70931: waiting for pending results... 
18714 1726853421.71276: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18714 1726853421.71516: in run() - task 02083763-bbaf-e784-4f7d-00000000002c 18714 1726853421.71519: variable 'ansible_search_path' from source: unknown 18714 1726853421.71522: variable 'ansible_search_path' from source: unknown 18714 1726853421.71668: calling self._execute() 18714 1726853421.71815: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.71819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.71832: variable 'omit' from source: magic vars 18714 1726853421.72646: variable 'ansible_distribution_major_version' from source: facts 18714 1726853421.72660: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853421.72666: variable 'omit' from source: magic vars 18714 1726853421.72851: variable 'omit' from source: magic vars 18714 1726853421.72928: variable 'omit' from source: magic vars 18714 1726853421.72931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853421.73007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853421.73024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853421.73161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853421.73174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853421.73202: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853421.73205: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.73207: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18714 1726853421.73472: Set connection var ansible_shell_executable to /bin/sh 18714 1726853421.73476: Set connection var ansible_timeout to 10 18714 1726853421.73478: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853421.73480: Set connection var ansible_connection to ssh 18714 1726853421.73482: Set connection var ansible_shell_type to sh 18714 1726853421.73484: Set connection var ansible_pipelining to False 18714 1726853421.73486: variable 'ansible_shell_executable' from source: unknown 18714 1726853421.73488: variable 'ansible_connection' from source: unknown 18714 1726853421.73490: variable 'ansible_module_compression' from source: unknown 18714 1726853421.73492: variable 'ansible_shell_type' from source: unknown 18714 1726853421.73583: variable 'ansible_shell_executable' from source: unknown 18714 1726853421.73586: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.73592: variable 'ansible_pipelining' from source: unknown 18714 1726853421.73594: variable 'ansible_timeout' from source: unknown 18714 1726853421.73598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.73920: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853421.73924: variable 'omit' from source: magic vars 18714 1726853421.73926: starting attempt loop 18714 1726853421.73928: running the handler 18714 1726853421.74053: variable '__network_connections_result' from source: set_fact 18714 1726853421.74192: variable '__network_connections_result' from source: set_fact 18714 1726853421.74425: handler run complete 18714 1726853421.74568: attempt loop complete, returning result 18714 1726853421.74572: 
_execute() done 18714 1726853421.74575: dumping result to json 18714 1726853421.74578: done dumping result, returning 18714 1726853421.74596: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-e784-4f7d-00000000002c] 18714 1726853421.74779: sending task result for task 02083763-bbaf-e784-4f7d-00000000002c 18714 1726853421.74841: done sending task result for task 02083763-bbaf-e784-4f7d-00000000002c 18714 1726853421.74843: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, bd03661e-f09e-4a4d-b5cf-80038b20f631 (not-active)" ] } } 18714 1726853421.74926: no more pending results, returning what we have 18714 1726853421.74929: results queue empty 18714 1726853421.74930: checking for any_errors_fatal 18714 1726853421.74936: done checking for any_errors_fatal 18714 1726853421.74937: checking for max_fail_percentage 18714 1726853421.74939: done checking for max_fail_percentage 18714 1726853421.74940: checking to see if all hosts have failed and the running result is not ok 18714 1726853421.74940: done 
checking to see if all hosts have failed 18714 1726853421.74941: getting the remaining hosts for this loop 18714 1726853421.74942: done getting the remaining hosts for this loop 18714 1726853421.74946: getting the next task for host managed_node1 18714 1726853421.74954: done getting next task for host managed_node1 18714 1726853421.74958: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18714 1726853421.74960: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853421.74968: getting variables 18714 1726853421.74970: in VariableManager get_vars() 18714 1726853421.75005: Calling all_inventory to load vars for managed_node1 18714 1726853421.75008: Calling groups_inventory to load vars for managed_node1 18714 1726853421.75010: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853421.75019: Calling all_plugins_play to load vars for managed_node1 18714 1726853421.75022: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853421.75025: Calling groups_plugins_play to load vars for managed_node1 18714 1726853421.76533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853421.79229: done with get_vars() 18714 1726853421.79252: done getting variables 18714 1726853421.79518: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:30:21 -0400 (0:00:00.091) 0:00:18.178 ****** 18714 1726853421.79552: entering _queue_task() for managed_node1/debug 18714 1726853421.80313: worker is 1 (out of 1 available) 18714 1726853421.80324: exiting _queue_task() for managed_node1/debug 18714 1726853421.80335: done queuing things up, now waiting for results queue to drain 18714 1726853421.80336: waiting for pending results... 18714 1726853421.80694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18714 1726853421.80979: in run() - task 02083763-bbaf-e784-4f7d-00000000002d 18714 1726853421.80983: variable 'ansible_search_path' from source: unknown 18714 1726853421.80985: variable 'ansible_search_path' from source: unknown 18714 1726853421.81021: calling self._execute() 18714 1726853421.81192: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.81196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.81207: variable 'omit' from source: magic vars 18714 1726853421.82225: variable 'ansible_distribution_major_version' from source: facts 18714 1726853421.82237: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853421.82406: variable 'network_state' from source: role '' defaults 18714 1726853421.82525: Evaluated conditional (network_state != {}): False 18714 1726853421.82529: when evaluation is False, skipping this task 18714 1726853421.82532: _execute() done 18714 1726853421.82535: dumping result to json 18714 1726853421.82537: done dumping result, returning 18714 1726853421.82568: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-e784-4f7d-00000000002d] 18714 1726853421.82573: sending task result for task 
02083763-bbaf-e784-4f7d-00000000002d skipping: [managed_node1] => { "false_condition": "network_state != {}" } 18714 1726853421.83024: no more pending results, returning what we have 18714 1726853421.83029: results queue empty 18714 1726853421.83030: checking for any_errors_fatal 18714 1726853421.83041: done checking for any_errors_fatal 18714 1726853421.83042: checking for max_fail_percentage 18714 1726853421.83045: done checking for max_fail_percentage 18714 1726853421.83046: checking to see if all hosts have failed and the running result is not ok 18714 1726853421.83047: done checking to see if all hosts have failed 18714 1726853421.83048: getting the remaining hosts for this loop 18714 1726853421.83049: done getting the remaining hosts for this loop 18714 1726853421.83054: getting the next task for host managed_node1 18714 1726853421.83062: done getting next task for host managed_node1 18714 1726853421.83066: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18714 1726853421.83069: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853421.83086: getting variables 18714 1726853421.83088: in VariableManager get_vars() 18714 1726853421.83130: Calling all_inventory to load vars for managed_node1 18714 1726853421.83133: Calling groups_inventory to load vars for managed_node1 18714 1726853421.83136: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853421.83148: Calling all_plugins_play to load vars for managed_node1 18714 1726853421.83151: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853421.83153: Calling groups_plugins_play to load vars for managed_node1 18714 1726853421.84081: done sending task result for task 02083763-bbaf-e784-4f7d-00000000002d 18714 1726853421.84084: WORKER PROCESS EXITING 18714 1726853421.85858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853421.87468: done with get_vars() 18714 1726853421.87498: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:30:21 -0400 (0:00:00.080) 0:00:18.259 ****** 18714 1726853421.87600: entering _queue_task() for managed_node1/ping 18714 1726853421.87602: Creating lock for ping 18714 1726853421.88244: worker is 1 (out of 1 available) 18714 1726853421.88254: exiting _queue_task() for managed_node1/ping 18714 1726853421.88266: done queuing things up, now waiting for results queue to drain 18714 1726853421.88267: waiting for pending results... 
18714 1726853421.88410: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18714 1726853421.88516: in run() - task 02083763-bbaf-e784-4f7d-00000000002e 18714 1726853421.88537: variable 'ansible_search_path' from source: unknown 18714 1726853421.88545: variable 'ansible_search_path' from source: unknown 18714 1726853421.88591: calling self._execute() 18714 1726853421.88704: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.88722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.88737: variable 'omit' from source: magic vars 18714 1726853421.89138: variable 'ansible_distribution_major_version' from source: facts 18714 1726853421.89166: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853421.89181: variable 'omit' from source: magic vars 18714 1726853421.89229: variable 'omit' from source: magic vars 18714 1726853421.89279: variable 'omit' from source: magic vars 18714 1726853421.89324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853421.89369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853421.89397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853421.89476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853421.89479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853421.89481: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853421.89483: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.89485: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 18714 1726853421.89590: Set connection var ansible_shell_executable to /bin/sh 18714 1726853421.89604: Set connection var ansible_timeout to 10 18714 1726853421.89617: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853421.89630: Set connection var ansible_connection to ssh 18714 1726853421.89641: Set connection var ansible_shell_type to sh 18714 1726853421.89651: Set connection var ansible_pipelining to False 18714 1726853421.89682: variable 'ansible_shell_executable' from source: unknown 18714 1726853421.89698: variable 'ansible_connection' from source: unknown 18714 1726853421.89800: variable 'ansible_module_compression' from source: unknown 18714 1726853421.89803: variable 'ansible_shell_type' from source: unknown 18714 1726853421.89805: variable 'ansible_shell_executable' from source: unknown 18714 1726853421.89808: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853421.89810: variable 'ansible_pipelining' from source: unknown 18714 1726853421.89812: variable 'ansible_timeout' from source: unknown 18714 1726853421.89814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853421.90063: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853421.90140: variable 'omit' from source: magic vars 18714 1726853421.90151: starting attempt loop 18714 1726853421.90158: running the handler 18714 1726853421.90181: _low_level_execute_command(): starting 18714 1726853421.90196: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853421.91668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853421.91686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 
1726853421.91699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853421.91940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853421.92089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853421.92416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853421.94213: stdout chunk (state=3): >>>/root <<< 18714 1726853421.94227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853421.94263: stdout chunk (state=3): >>><<< 18714 1726853421.94285: stderr chunk (state=3): >>><<< 18714 1726853421.94343: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853421.94682: _low_level_execute_command(): starting 18714 1726853421.94687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551 `" && echo ansible-tmp-1726853421.9435015-19531-175141869165551="` echo /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551 `" ) && sleep 0' 18714 1726853421.95894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853421.95947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853421.95963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853421.96030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853421.97908: stdout chunk (state=3): >>>ansible-tmp-1726853421.9435015-19531-175141869165551=/root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551 <<< 18714 1726853421.98046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853421.98057: stdout chunk (state=3): >>><<< 18714 1726853421.98082: stderr chunk (state=3): >>><<< 18714 1726853421.98109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853421.9435015-19531-175141869165551=/root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853421.98327: variable 'ansible_module_compression' from source: unknown 18714 1726853421.98379: ANSIBALLZ: Using lock for ping 18714 1726853421.98513: ANSIBALLZ: Acquiring lock 18714 1726853421.98519: ANSIBALLZ: Lock acquired: 139791967632560 18714 1726853421.98526: ANSIBALLZ: Creating module 18714 1726853422.11975: ANSIBALLZ: Writing module into payload 18714 1726853422.12050: ANSIBALLZ: Writing module 18714 1726853422.12080: ANSIBALLZ: Renaming module 18714 1726853422.12101: ANSIBALLZ: Done creating module 18714 1726853422.12124: variable 'ansible_facts' from source: unknown 18714 1726853422.12207: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/AnsiballZ_ping.py 18714 1726853422.12476: Sending initial data 18714 1726853422.12480: Sent initial data (153 bytes) 18714 1726853422.13625: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853422.13732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853422.13906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853422.13935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853422.14014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853422.15685: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853422.15747: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853422.15828: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpiw2o1z7m /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/AnsiballZ_ping.py <<< 18714 1726853422.15844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/AnsiballZ_ping.py" <<< 18714 1726853422.15874: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpiw2o1z7m" to remote "/root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/AnsiballZ_ping.py" <<< 18714 1726853422.16619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853422.16658: stderr chunk (state=3): >>><<< 18714 1726853422.16776: stdout chunk (state=3): >>><<< 18714 1726853422.16779: done transferring module to remote 18714 1726853422.16782: _low_level_execute_command(): starting 18714 1726853422.16784: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/ /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/AnsiballZ_ping.py && sleep 0' 18714 1726853422.17380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853422.17396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853422.17488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853422.17516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853422.17533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853422.17561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853422.17642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853422.19494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853422.19505: stdout chunk (state=3): >>><<< 18714 1726853422.19517: stderr chunk (state=3): >>><<< 18714 1726853422.19536: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853422.19544: _low_level_execute_command(): starting 18714 1726853422.19556: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/AnsiballZ_ping.py && sleep 0' 18714 1726853422.20148: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853422.20168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853422.20187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853422.20205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853422.20223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853422.20236: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853422.20250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853422.20363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853422.20367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853422.20416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853422.20467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853422.36490: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18714 1726853422.37689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853422.37708: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 18714 1726853422.37772: stderr chunk (state=3): >>><<< 18714 1726853422.38002: stdout chunk (state=3): >>><<< 18714 1726853422.38006: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853422.38009: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853422.38011: _low_level_execute_command(): starting 18714 1726853422.38013: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853421.9435015-19531-175141869165551/ > /dev/null 2>&1 && sleep 0' 18714 1726853422.39092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853422.39374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853422.39392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853422.39407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853422.39475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853422.41577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853422.41581: stderr chunk (state=3): >>><<< 18714 1726853422.41582: stdout chunk (state=3): >>><<< 18714 1726853422.41584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 18714 1726853422.41592: handler run complete 18714 1726853422.41594: attempt loop complete, returning result 18714 1726853422.41596: _execute() done 18714 1726853422.41598: dumping result to json 18714 1726853422.41599: done dumping result, returning 18714 1726853422.41601: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-e784-4f7d-00000000002e] 18714 1726853422.41603: sending task result for task 02083763-bbaf-e784-4f7d-00000000002e 18714 1726853422.41662: done sending task result for task 02083763-bbaf-e784-4f7d-00000000002e 18714 1726853422.41665: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 18714 1726853422.41727: no more pending results, returning what we have 18714 1726853422.41730: results queue empty 18714 1726853422.41731: checking for any_errors_fatal 18714 1726853422.41737: done checking for any_errors_fatal 18714 1726853422.41738: checking for max_fail_percentage 18714 1726853422.41740: done checking for max_fail_percentage 18714 1726853422.41741: checking to see if all hosts have failed and the running result is not ok 18714 1726853422.41742: done checking to see if all hosts have failed 18714 1726853422.41742: getting the remaining hosts for this loop 18714 1726853422.41744: done getting the remaining hosts for this loop 18714 1726853422.41748: getting the next task for host managed_node1 18714 1726853422.41760: done getting next task for host managed_node1 18714 1726853422.41763: ^ task is: TASK: meta (role_complete) 18714 1726853422.41764: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853422.41780: getting variables 18714 1726853422.41783: in VariableManager get_vars() 18714 1726853422.41823: Calling all_inventory to load vars for managed_node1 18714 1726853422.41826: Calling groups_inventory to load vars for managed_node1 18714 1726853422.41829: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.41840: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.41843: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.41846: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.45236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.48597: done with get_vars() 18714 1726853422.48631: done getting variables 18714 1726853422.48836: done queuing things up, now waiting for results queue to drain 18714 1726853422.48838: results queue empty 18714 1726853422.48839: checking for any_errors_fatal 18714 1726853422.48842: done checking for any_errors_fatal 18714 1726853422.48843: checking for max_fail_percentage 18714 1726853422.48844: done checking for max_fail_percentage 18714 1726853422.48845: checking to see if all hosts have failed and the running result is not ok 18714 1726853422.48845: done checking to see if all hosts have failed 18714 1726853422.48846: getting the remaining hosts for this loop 18714 1726853422.48847: done getting the remaining hosts for this loop 18714 1726853422.48850: getting the next task for host managed_node1 18714 1726853422.48976: done getting next task for host managed_node1 18714 1726853422.48980: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18714 1726853422.48982: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18714 1726853422.48984: getting variables 18714 1726853422.48985: in VariableManager get_vars() 18714 1726853422.48999: Calling all_inventory to load vars for managed_node1 18714 1726853422.49002: Calling groups_inventory to load vars for managed_node1 18714 1726853422.49004: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.49009: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.49011: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.49014: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.50614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.52280: done with get_vars() 18714 1726853422.52305: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 13:30:22 -0400 (0:00:00.647) 0:00:18.907 ****** 18714 1726853422.52396: entering _queue_task() for managed_node1/include_tasks 18714 1726853422.52894: worker is 1 (out of 1 available) 18714 1726853422.52906: exiting _queue_task() for managed_node1/include_tasks 18714 1726853422.52917: done queuing things up, now waiting for results queue to drain 18714 1726853422.52918: waiting for pending results... 
18714 1726853422.53237: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18714 1726853422.53242: in run() - task 02083763-bbaf-e784-4f7d-000000000030 18714 1726853422.53264: variable 'ansible_search_path' from source: unknown 18714 1726853422.53308: calling self._execute() 18714 1726853422.53423: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.53443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.53464: variable 'omit' from source: magic vars 18714 1726853422.54184: variable 'ansible_distribution_major_version' from source: facts 18714 1726853422.54187: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853422.54190: _execute() done 18714 1726853422.54192: dumping result to json 18714 1726853422.54195: done dumping result, returning 18714 1726853422.54197: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [02083763-bbaf-e784-4f7d-000000000030] 18714 1726853422.54200: sending task result for task 02083763-bbaf-e784-4f7d-000000000030 18714 1726853422.54385: no more pending results, returning what we have 18714 1726853422.54390: in VariableManager get_vars() 18714 1726853422.54432: Calling all_inventory to load vars for managed_node1 18714 1726853422.54434: Calling groups_inventory to load vars for managed_node1 18714 1726853422.54437: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.54449: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.54461: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.54465: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.55411: done sending task result for task 02083763-bbaf-e784-4f7d-000000000030 18714 1726853422.55415: WORKER PROCESS EXITING 18714 1726853422.56755: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.59088: done with get_vars() 18714 1726853422.59114: variable 'ansible_search_path' from source: unknown 18714 1726853422.59129: we have included files to process 18714 1726853422.59130: generating all_blocks data 18714 1726853422.59132: done generating all_blocks data 18714 1726853422.59136: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18714 1726853422.59138: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18714 1726853422.59140: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18714 1726853422.59512: done processing included file 18714 1726853422.59514: iterating over new_blocks loaded from include file 18714 1726853422.59515: in VariableManager get_vars() 18714 1726853422.59530: done with get_vars() 18714 1726853422.59531: filtering new block on tags 18714 1726853422.59549: done filtering new block on tags 18714 1726853422.59553: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node1 18714 1726853422.59558: extending task lists for all hosts with included blocks 18714 1726853422.59592: done extending task lists 18714 1726853422.59594: done processing included files 18714 1726853422.59595: results queue empty 18714 1726853422.59595: checking for any_errors_fatal 18714 1726853422.59597: done checking for any_errors_fatal 18714 1726853422.59598: checking for max_fail_percentage 18714 1726853422.59599: done checking for 
max_fail_percentage 18714 1726853422.59600: checking to see if all hosts have failed and the running result is not ok 18714 1726853422.59601: done checking to see if all hosts have failed 18714 1726853422.59601: getting the remaining hosts for this loop 18714 1726853422.59603: done getting the remaining hosts for this loop 18714 1726853422.59605: getting the next task for host managed_node1 18714 1726853422.59609: done getting next task for host managed_node1 18714 1726853422.59612: ^ task is: TASK: Assert that warnings is empty 18714 1726853422.59614: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853422.59616: getting variables 18714 1726853422.59617: in VariableManager get_vars() 18714 1726853422.59629: Calling all_inventory to load vars for managed_node1 18714 1726853422.59631: Calling groups_inventory to load vars for managed_node1 18714 1726853422.59633: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.59638: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.59641: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.59643: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.60857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.62740: done with get_vars() 18714 1726853422.62777: done getting variables 18714 1726853422.62818: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Friday 20 September 2024 13:30:22 -0400 (0:00:00.104) 0:00:19.011 ****** 18714 1726853422.62847: entering _queue_task() for managed_node1/assert 18714 1726853422.63325: worker is 1 (out of 1 available) 18714 1726853422.63337: exiting _queue_task() for managed_node1/assert 18714 1726853422.63347: done queuing things up, now waiting for results queue to drain 18714 1726853422.63348: waiting for pending results... 
18714 1726853422.63555: running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty 18714 1726853422.63682: in run() - task 02083763-bbaf-e784-4f7d-000000000304 18714 1726853422.63706: variable 'ansible_search_path' from source: unknown 18714 1726853422.63714: variable 'ansible_search_path' from source: unknown 18714 1726853422.63766: calling self._execute() 18714 1726853422.63884: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.63896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.63917: variable 'omit' from source: magic vars 18714 1726853422.64344: variable 'ansible_distribution_major_version' from source: facts 18714 1726853422.64361: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853422.64370: variable 'omit' from source: magic vars 18714 1726853422.64450: variable 'omit' from source: magic vars 18714 1726853422.64458: variable 'omit' from source: magic vars 18714 1726853422.64506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853422.64543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853422.64574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853422.64594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853422.64616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853422.64667: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853422.64670: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.64673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 
1726853422.64827: Set connection var ansible_shell_executable to /bin/sh 18714 1726853422.64832: Set connection var ansible_timeout to 10 18714 1726853422.64835: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853422.64836: Set connection var ansible_connection to ssh 18714 1726853422.64838: Set connection var ansible_shell_type to sh 18714 1726853422.64840: Set connection var ansible_pipelining to False 18714 1726853422.64841: variable 'ansible_shell_executable' from source: unknown 18714 1726853422.64843: variable 'ansible_connection' from source: unknown 18714 1726853422.64845: variable 'ansible_module_compression' from source: unknown 18714 1726853422.64847: variable 'ansible_shell_type' from source: unknown 18714 1726853422.64860: variable 'ansible_shell_executable' from source: unknown 18714 1726853422.64866: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.64874: variable 'ansible_pipelining' from source: unknown 18714 1726853422.64880: variable 'ansible_timeout' from source: unknown 18714 1726853422.64939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.65032: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853422.65060: variable 'omit' from source: magic vars 18714 1726853422.65077: starting attempt loop 18714 1726853422.65085: running the handler 18714 1726853422.65226: variable '__network_connections_result' from source: set_fact 18714 1726853422.65246: Evaluated conditional ('warnings' not in __network_connections_result): True 18714 1726853422.65259: handler run complete 18714 1726853422.65292: attempt loop complete, returning result 18714 1726853422.65379: _execute() done 18714 
1726853422.65382: dumping result to json 18714 1726853422.65384: done dumping result, returning 18714 1726853422.65386: done running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty [02083763-bbaf-e784-4f7d-000000000304] 18714 1726853422.65390: sending task result for task 02083763-bbaf-e784-4f7d-000000000304 18714 1726853422.65460: done sending task result for task 02083763-bbaf-e784-4f7d-000000000304 18714 1726853422.65463: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18714 1726853422.65521: no more pending results, returning what we have 18714 1726853422.65525: results queue empty 18714 1726853422.65526: checking for any_errors_fatal 18714 1726853422.65528: done checking for any_errors_fatal 18714 1726853422.65529: checking for max_fail_percentage 18714 1726853422.65530: done checking for max_fail_percentage 18714 1726853422.65531: checking to see if all hosts have failed and the running result is not ok 18714 1726853422.65532: done checking to see if all hosts have failed 18714 1726853422.65533: getting the remaining hosts for this loop 18714 1726853422.65534: done getting the remaining hosts for this loop 18714 1726853422.65538: getting the next task for host managed_node1 18714 1726853422.65545: done getting next task for host managed_node1 18714 1726853422.65548: ^ task is: TASK: Assert that there is output in stderr 18714 1726853422.65554: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18714 1726853422.65559: getting variables 18714 1726853422.65560: in VariableManager get_vars() 18714 1726853422.65604: Calling all_inventory to load vars for managed_node1 18714 1726853422.65607: Calling groups_inventory to load vars for managed_node1 18714 1726853422.65610: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.65623: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.65626: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.65629: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.73848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.75580: done with get_vars() 18714 1726853422.75616: done getting variables 18714 1726853422.75674: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Friday 20 September 2024 13:30:22 -0400 (0:00:00.128) 0:00:19.140 ****** 18714 1726853422.75704: entering _queue_task() for managed_node1/assert 18714 1726853422.76304: worker is 1 (out of 1 available) 18714 1726853422.76316: exiting _queue_task() for managed_node1/assert 18714 1726853422.76327: done queuing things up, now waiting for results queue to drain 18714 1726853422.76328: waiting for pending results... 
18714 1726853422.76567: running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr 18714 1726853422.76575: in run() - task 02083763-bbaf-e784-4f7d-000000000305 18714 1726853422.76578: variable 'ansible_search_path' from source: unknown 18714 1726853422.76581: variable 'ansible_search_path' from source: unknown 18714 1726853422.76602: calling self._execute() 18714 1726853422.76711: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.76723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.76738: variable 'omit' from source: magic vars 18714 1726853422.77136: variable 'ansible_distribution_major_version' from source: facts 18714 1726853422.77151: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853422.77160: variable 'omit' from source: magic vars 18714 1726853422.77204: variable 'omit' from source: magic vars 18714 1726853422.77312: variable 'omit' from source: magic vars 18714 1726853422.77315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853422.77327: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853422.77351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853422.77369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853422.77386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853422.77426: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853422.77435: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.77444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 
1726853422.77565: Set connection var ansible_shell_executable to /bin/sh 18714 1726853422.77581: Set connection var ansible_timeout to 10 18714 1726853422.77591: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853422.77602: Set connection var ansible_connection to ssh 18714 1726853422.77611: Set connection var ansible_shell_type to sh 18714 1726853422.77638: Set connection var ansible_pipelining to False 18714 1726853422.77653: variable 'ansible_shell_executable' from source: unknown 18714 1726853422.77670: variable 'ansible_connection' from source: unknown 18714 1726853422.77675: variable 'ansible_module_compression' from source: unknown 18714 1726853422.77747: variable 'ansible_shell_type' from source: unknown 18714 1726853422.77750: variable 'ansible_shell_executable' from source: unknown 18714 1726853422.77752: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.77755: variable 'ansible_pipelining' from source: unknown 18714 1726853422.77757: variable 'ansible_timeout' from source: unknown 18714 1726853422.77759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.77859: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853422.77878: variable 'omit' from source: magic vars 18714 1726853422.77893: starting attempt loop 18714 1726853422.77900: running the handler 18714 1726853422.78043: variable '__network_connections_result' from source: set_fact 18714 1726853422.78062: Evaluated conditional ('stderr' in __network_connections_result): True 18714 1726853422.78082: handler run complete 18714 1726853422.78106: attempt loop complete, returning result 18714 1726853422.78180: _execute() done 18714 
1726853422.78183: dumping result to json 18714 1726853422.78186: done dumping result, returning 18714 1726853422.78188: done running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr [02083763-bbaf-e784-4f7d-000000000305] 18714 1726853422.78191: sending task result for task 02083763-bbaf-e784-4f7d-000000000305 18714 1726853422.78261: done sending task result for task 02083763-bbaf-e784-4f7d-000000000305 18714 1726853422.78264: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18714 1726853422.78325: no more pending results, returning what we have 18714 1726853422.78328: results queue empty 18714 1726853422.78329: checking for any_errors_fatal 18714 1726853422.78338: done checking for any_errors_fatal 18714 1726853422.78339: checking for max_fail_percentage 18714 1726853422.78340: done checking for max_fail_percentage 18714 1726853422.78341: checking to see if all hosts have failed and the running result is not ok 18714 1726853422.78342: done checking to see if all hosts have failed 18714 1726853422.78343: getting the remaining hosts for this loop 18714 1726853422.78344: done getting the remaining hosts for this loop 18714 1726853422.78348: getting the next task for host managed_node1 18714 1726853422.78358: done getting next task for host managed_node1 18714 1726853422.78361: ^ task is: TASK: meta (flush_handlers) 18714 1726853422.78363: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853422.78367: getting variables 18714 1726853422.78369: in VariableManager get_vars() 18714 1726853422.78496: Calling all_inventory to load vars for managed_node1 18714 1726853422.78614: Calling groups_inventory to load vars for managed_node1 18714 1726853422.78618: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.78631: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.78634: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.78637: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.80144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.82008: done with get_vars() 18714 1726853422.82035: done getting variables 18714 1726853422.82105: in VariableManager get_vars() 18714 1726853422.82117: Calling all_inventory to load vars for managed_node1 18714 1726853422.82119: Calling groups_inventory to load vars for managed_node1 18714 1726853422.82121: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.82126: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.82128: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.82131: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.83312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.84950: done with get_vars() 18714 1726853422.84987: done queuing things up, now waiting for results queue to drain 18714 1726853422.84989: results queue empty 18714 1726853422.84990: checking for any_errors_fatal 18714 1726853422.84993: done checking for any_errors_fatal 18714 1726853422.84994: checking for max_fail_percentage 18714 1726853422.84995: done checking for max_fail_percentage 18714 1726853422.84995: checking to see if all hosts have failed and the running result is not 
ok 18714 1726853422.84996: done checking to see if all hosts have failed 18714 1726853422.84997: getting the remaining hosts for this loop 18714 1726853422.85008: done getting the remaining hosts for this loop 18714 1726853422.85012: getting the next task for host managed_node1 18714 1726853422.85016: done getting next task for host managed_node1 18714 1726853422.85018: ^ task is: TASK: meta (flush_handlers) 18714 1726853422.85019: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853422.85021: getting variables 18714 1726853422.85022: in VariableManager get_vars() 18714 1726853422.85035: Calling all_inventory to load vars for managed_node1 18714 1726853422.85037: Calling groups_inventory to load vars for managed_node1 18714 1726853422.85038: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.85043: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.85045: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.85048: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.86376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.88009: done with get_vars() 18714 1726853422.88030: done getting variables 18714 1726853422.88080: in VariableManager get_vars() 18714 1726853422.88091: Calling all_inventory to load vars for managed_node1 18714 1726853422.88095: Calling groups_inventory to load vars for managed_node1 18714 1726853422.88096: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.88101: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.88103: Calling groups_plugins_inventory to load vars for 
managed_node1 18714 1726853422.88105: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.89269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.90886: done with get_vars() 18714 1726853422.90922: done queuing things up, now waiting for results queue to drain 18714 1726853422.90925: results queue empty 18714 1726853422.90926: checking for any_errors_fatal 18714 1726853422.90927: done checking for any_errors_fatal 18714 1726853422.90928: checking for max_fail_percentage 18714 1726853422.90929: done checking for max_fail_percentage 18714 1726853422.90930: checking to see if all hosts have failed and the running result is not ok 18714 1726853422.90930: done checking to see if all hosts have failed 18714 1726853422.90931: getting the remaining hosts for this loop 18714 1726853422.90932: done getting the remaining hosts for this loop 18714 1726853422.90935: getting the next task for host managed_node1 18714 1726853422.90939: done getting next task for host managed_node1 18714 1726853422.90940: ^ task is: None 18714 1726853422.90941: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853422.90943: done queuing things up, now waiting for results queue to drain 18714 1726853422.90944: results queue empty 18714 1726853422.90944: checking for any_errors_fatal 18714 1726853422.90945: done checking for any_errors_fatal 18714 1726853422.90946: checking for max_fail_percentage 18714 1726853422.90947: done checking for max_fail_percentage 18714 1726853422.90947: checking to see if all hosts have failed and the running result is not ok 18714 1726853422.90948: done checking to see if all hosts have failed 18714 1726853422.90949: getting the next task for host managed_node1 18714 1726853422.90951: done getting next task for host managed_node1 18714 1726853422.90952: ^ task is: None 18714 1726853422.90953: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853422.91003: in VariableManager get_vars() 18714 1726853422.91020: done with get_vars() 18714 1726853422.91026: in VariableManager get_vars() 18714 1726853422.91037: done with get_vars() 18714 1726853422.91043: variable 'omit' from source: magic vars 18714 1726853422.91076: in VariableManager get_vars() 18714 1726853422.91088: done with get_vars() 18714 1726853422.91126: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 18714 1726853422.91313: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853422.91372: getting the remaining hosts for this loop 18714 1726853422.91374: done getting the remaining hosts for this loop 18714 1726853422.91376: getting the next task for host managed_node1 18714 1726853422.91379: done getting next task for host managed_node1 18714 1726853422.91381: ^ task is: TASK: Gathering Facts 18714 1726853422.91382: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853422.91385: getting variables 18714 1726853422.91386: in VariableManager get_vars() 18714 1726853422.91394: Calling all_inventory to load vars for managed_node1 18714 1726853422.91411: Calling groups_inventory to load vars for managed_node1 18714 1726853422.91414: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853422.91420: Calling all_plugins_play to load vars for managed_node1 18714 1726853422.91422: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853422.91425: Calling groups_plugins_play to load vars for managed_node1 18714 1726853422.92765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853422.94450: done with get_vars() 18714 1726853422.94479: done getting variables 18714 1726853422.94539: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 13:30:22 -0400 (0:00:00.188) 0:00:19.329 ****** 18714 1726853422.94578: entering _queue_task() for managed_node1/gather_facts 18714 1726853422.95023: worker is 1 (out of 1 available) 18714 1726853422.95037: exiting _queue_task() for managed_node1/gather_facts 18714 1726853422.95046: done queuing things up, now waiting for results queue to drain 18714 1726853422.95047: waiting for pending results... 
18714 1726853422.95486: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853422.95490: in run() - task 02083763-bbaf-e784-4f7d-000000000316 18714 1726853422.95494: variable 'ansible_search_path' from source: unknown 18714 1726853422.95497: calling self._execute() 18714 1726853422.95577: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.95588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.95603: variable 'omit' from source: magic vars 18714 1726853422.96015: variable 'ansible_distribution_major_version' from source: facts 18714 1726853422.96031: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853422.96047: variable 'omit' from source: magic vars 18714 1726853422.96082: variable 'omit' from source: magic vars 18714 1726853422.96120: variable 'omit' from source: magic vars 18714 1726853422.96477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853422.96482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853422.96484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853422.96486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853422.96586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853422.96589: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853422.96592: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.96594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.96711: Set connection var ansible_shell_executable to /bin/sh 18714 1726853422.96756: Set 
connection var ansible_timeout to 10 18714 1726853422.96811: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853422.96822: Set connection var ansible_connection to ssh 18714 1726853422.96830: Set connection var ansible_shell_type to sh 18714 1726853422.96899: Set connection var ansible_pipelining to False 18714 1726853422.96944: variable 'ansible_shell_executable' from source: unknown 18714 1726853422.97129: variable 'ansible_connection' from source: unknown 18714 1726853422.97132: variable 'ansible_module_compression' from source: unknown 18714 1726853422.97134: variable 'ansible_shell_type' from source: unknown 18714 1726853422.97136: variable 'ansible_shell_executable' from source: unknown 18714 1726853422.97138: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853422.97140: variable 'ansible_pipelining' from source: unknown 18714 1726853422.97142: variable 'ansible_timeout' from source: unknown 18714 1726853422.97144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853422.97411: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853422.97681: variable 'omit' from source: magic vars 18714 1726853422.97684: starting attempt loop 18714 1726853422.97687: running the handler 18714 1726853422.97689: variable 'ansible_facts' from source: unknown 18714 1726853422.97691: _low_level_execute_command(): starting 18714 1726853422.97694: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853422.99112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853422.99213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853422.99242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853422.99267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853422.99443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853423.01138: stdout chunk (state=3): >>>/root <<< 18714 1726853423.01282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853423.01294: stdout chunk (state=3): >>><<< 18714 1726853423.01308: stderr chunk (state=3): >>><<< 18714 1726853423.01540: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853423.01543: _low_level_execute_command(): starting 18714 1726853423.01546: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355 `" && echo ansible-tmp-1726853423.0144994-19583-134363593786355="` echo /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355 `" ) && sleep 0' 18714 1726853423.02435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853423.02531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853423.02637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853423.02769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853423.02809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853423.04721: stdout chunk (state=3): >>>ansible-tmp-1726853423.0144994-19583-134363593786355=/root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355 <<< 18714 1726853423.04884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853423.04921: stdout chunk (state=3): >>><<< 18714 1726853423.05028: stderr chunk (state=3): >>><<< 18714 1726853423.05033: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853423.0144994-19583-134363593786355=/root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853423.05103: variable 'ansible_module_compression' from source: unknown 18714 1726853423.05279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853423.05587: variable 'ansible_facts' from source: unknown 18714 1726853423.05772: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/AnsiballZ_setup.py 18714 1726853423.06291: Sending initial data 18714 1726853423.06303: Sent initial data (154 bytes) 18714 1726853423.07397: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853423.07412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853423.07434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853423.07534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853423.07777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853423.07859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853423.09434: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18714 1726853423.09451: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853423.09521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853423.09569: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpdjcufzc_ /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/AnsiballZ_setup.py <<< 18714 1726853423.09583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/AnsiballZ_setup.py" <<< 18714 1726853423.09682: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpdjcufzc_" to remote "/root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/AnsiballZ_setup.py" <<< 18714 1726853423.09695: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/AnsiballZ_setup.py" <<< 18714 1726853423.12882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853423.12886: stderr chunk (state=3): >>><<< 18714 1726853423.12888: stdout chunk (state=3): >>><<< 18714 1726853423.12891: done transferring module to remote 18714 1726853423.12893: _low_level_execute_command(): starting 18714 1726853423.12895: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/ /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/AnsiballZ_setup.py && sleep 0' 18714 1726853423.14160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853423.14164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853423.14180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853423.14241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853423.14775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853423.16421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853423.16429: stderr chunk (state=3): >>><<< 18714 1726853423.16432: stdout chunk (state=3): >>><<< 18714 1726853423.16434: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853423.16437: _low_level_execute_command(): starting 18714 1726853423.16439: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/AnsiballZ_setup.py && sleep 0' 18714 1726853423.17068: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853423.17085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853423.17098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853423.17113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853423.17128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853423.17160: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853423.17261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK <<< 18714 1726853423.17296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853423.17349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853423.83905: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.39306640625, "5m": 0.35400390625, "15m": 0.17138671875}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "23", "epoch": "1726853423", "epoch_int": "1726853423", "date": "2024-09-20", "time": "13:30:23", "iso8601_micro": "2024-09-20T17:30:23.444252Z", "iso8601": "2024-09-20T17:30:23Z", "iso8601_basic": "20240920T133023444252", "iso8601_basic_short": "20240920T133023", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", 
"GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_R<<< 18714 1726853423.83913: stdout chunk (state=3): >>>UNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 563, "free": 2968}, "nocache": {"free": 3306, "used": 225}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 589, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794824192, "block_size": 4096, "block_total": 65519099, "block_available": 63914752, "block_used": 1604347, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_interfaces": ["lsr27", "eth0", "lo", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": 
"off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853423.86061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853423.86076: stdout chunk (state=3): >>><<< 18714 1726853423.86278: stderr chunk (state=3): >>><<< 18714 1726853423.86284: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.39306640625, "5m": 0.35400390625, "15m": 0.17138671875}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "23", "epoch": "1726853423", "epoch_int": "1726853423", "date": "2024-09-20", "time": "13:30:23", "iso8601_micro": "2024-09-20T17:30:23.444252Z", "iso8601": "2024-09-20T17:30:23Z", "iso8601_basic": "20240920T133023444252", "iso8601_basic_short": "20240920T133023", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 563, "free": 2968}, "nocache": {"free": 3306, "used": 225}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 589, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794824192, "block_size": 4096, "block_total": 65519099, "block_available": 63914752, "block_used": 1604347, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_interfaces": ["lsr27", "eth0", "lo", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": 
"off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", 
"tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853423.86646: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853423.86679: _low_level_execute_command(): starting 18714 1726853423.86690: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853423.0144994-19583-134363593786355/ > /dev/null 2>&1 && sleep 0' 18714 1726853423.87376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853423.87403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853423.87420: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853423.87441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853423.87520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853423.87566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853423.87593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853423.87628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853423.87698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853423.89629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853423.89632: stdout chunk (state=3): >>><<< 18714 1726853423.89634: stderr chunk (state=3): >>><<< 18714 1726853423.89777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853423.89782: handler run complete 18714 1726853423.89837: variable 'ansible_facts' from source: unknown 18714 1726853423.89953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853423.90422: variable 'ansible_facts' from source: unknown 18714 1726853423.90878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853423.91034: attempt loop complete, returning result 18714 1726853423.91043: _execute() done 18714 1726853423.91054: dumping result to json 18714 1726853423.91099: done dumping result, returning 18714 1726853423.91138: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-000000000316] 18714 1726853423.91183: sending task result for task 02083763-bbaf-e784-4f7d-000000000316 18714 1726853423.91835: done sending task result for task 02083763-bbaf-e784-4f7d-000000000316 18714 1726853423.91839: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853423.92170: no more pending results, returning what we have 18714 1726853423.92175: results queue empty 18714 
1726853423.92176: checking for any_errors_fatal 18714 1726853423.92178: done checking for any_errors_fatal 18714 1726853423.92179: checking for max_fail_percentage 18714 1726853423.92180: done checking for max_fail_percentage 18714 1726853423.92181: checking to see if all hosts have failed and the running result is not ok 18714 1726853423.92182: done checking to see if all hosts have failed 18714 1726853423.92182: getting the remaining hosts for this loop 18714 1726853423.92183: done getting the remaining hosts for this loop 18714 1726853423.92187: getting the next task for host managed_node1 18714 1726853423.92192: done getting next task for host managed_node1 18714 1726853423.92193: ^ task is: TASK: meta (flush_handlers) 18714 1726853423.92195: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853423.92206: getting variables 18714 1726853423.92207: in VariableManager get_vars() 18714 1726853423.92232: Calling all_inventory to load vars for managed_node1 18714 1726853423.92235: Calling groups_inventory to load vars for managed_node1 18714 1726853423.92238: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853423.92248: Calling all_plugins_play to load vars for managed_node1 18714 1726853423.92251: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853423.92254: Calling groups_plugins_play to load vars for managed_node1 18714 1726853423.93860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853423.95550: done with get_vars() 18714 1726853423.95581: done getting variables 18714 1726853423.95650: in VariableManager get_vars() 18714 1726853423.95659: Calling all_inventory to load vars for managed_node1 18714 1726853423.95662: Calling groups_inventory to load vars for managed_node1 18714 1726853423.95664: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853423.95674: Calling all_plugins_play to load vars for managed_node1 18714 1726853423.95677: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853423.95680: Calling groups_plugins_play to load vars for managed_node1 18714 1726853423.96954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853423.99607: done with get_vars() 18714 1726853423.99640: done queuing things up, now waiting for results queue to drain 18714 1726853423.99642: results queue empty 18714 1726853423.99643: checking for any_errors_fatal 18714 1726853423.99647: done checking for any_errors_fatal 18714 1726853423.99653: checking for max_fail_percentage 18714 1726853423.99654: done checking for max_fail_percentage 18714 1726853423.99655: checking to see if all hosts have failed and the running result is not 
ok 18714 1726853423.99655: done checking to see if all hosts have failed 18714 1726853423.99656: getting the remaining hosts for this loop 18714 1726853423.99657: done getting the remaining hosts for this loop 18714 1726853423.99660: getting the next task for host managed_node1 18714 1726853423.99664: done getting next task for host managed_node1 18714 1726853423.99666: ^ task is: TASK: Show network_provider 18714 1726853423.99668: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853423.99670: getting variables 18714 1726853423.99899: in VariableManager get_vars() 18714 1726853423.99913: Calling all_inventory to load vars for managed_node1 18714 1726853423.99916: Calling groups_inventory to load vars for managed_node1 18714 1726853423.99918: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853423.99923: Calling all_plugins_play to load vars for managed_node1 18714 1726853423.99925: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853423.99928: Calling groups_plugins_play to load vars for managed_node1 18714 1726853424.02359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853424.07557: done with get_vars() 18714 1726853424.07992: done getting variables 18714 1726853424.08042: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 13:30:24 -0400 (0:00:01.134) 0:00:20.464 ****** 18714 1726853424.08078: entering _queue_task() for managed_node1/debug 18714 1726853424.09430: worker is 1 (out of 1 available) 18714 1726853424.09442: exiting _queue_task() for managed_node1/debug 18714 1726853424.09456: done queuing things up, now waiting for results queue to drain 18714 1726853424.09458: waiting for pending results... 18714 1726853424.10292: running TaskExecutor() for managed_node1/TASK: Show network_provider 18714 1726853424.10578: in run() - task 02083763-bbaf-e784-4f7d-000000000033 18714 1726853424.10629: variable 'ansible_search_path' from source: unknown 18714 1726853424.10652: calling self._execute() 18714 1726853424.11064: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853424.11067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853424.11078: variable 'omit' from source: magic vars 18714 1726853424.12038: variable 'ansible_distribution_major_version' from source: facts 18714 1726853424.12135: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853424.12147: variable 'omit' from source: magic vars 18714 1726853424.12241: variable 'omit' from source: magic vars 18714 1726853424.12391: variable 'omit' from source: magic vars 18714 1726853424.12437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853424.12665: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853424.12669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853424.12675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853424.12691: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853424.12726: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853424.12734: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853424.12781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853424.12987: Set connection var ansible_shell_executable to /bin/sh 18714 1726853424.13000: Set connection var ansible_timeout to 10 18714 1726853424.13009: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853424.13019: Set connection var ansible_connection to ssh 18714 1726853424.13204: Set connection var ansible_shell_type to sh 18714 1726853424.13208: Set connection var ansible_pipelining to False 18714 1726853424.13210: variable 'ansible_shell_executable' from source: unknown 18714 1726853424.13212: variable 'ansible_connection' from source: unknown 18714 1726853424.13214: variable 'ansible_module_compression' from source: unknown 18714 1726853424.13216: variable 'ansible_shell_type' from source: unknown 18714 1726853424.13218: variable 'ansible_shell_executable' from source: unknown 18714 1726853424.13220: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853424.13221: variable 'ansible_pipelining' from source: unknown 18714 1726853424.13223: variable 'ansible_timeout' from source: unknown 18714 1726853424.13225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853424.13453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853424.13639: variable 'omit' from source: magic vars 18714 
1726853424.13642: starting attempt loop 18714 1726853424.13644: running the handler 18714 1726853424.13646: variable 'network_provider' from source: set_fact 18714 1726853424.13791: variable 'network_provider' from source: set_fact 18714 1726853424.13806: handler run complete 18714 1726853424.13875: attempt loop complete, returning result 18714 1726853424.13883: _execute() done 18714 1726853424.13890: dumping result to json 18714 1726853424.13896: done dumping result, returning 18714 1726853424.13907: done running TaskExecutor() for managed_node1/TASK: Show network_provider [02083763-bbaf-e784-4f7d-000000000033] 18714 1726853424.14073: sending task result for task 02083763-bbaf-e784-4f7d-000000000033 18714 1726853424.14143: done sending task result for task 02083763-bbaf-e784-4f7d-000000000033 18714 1726853424.14146: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 18714 1726853424.14209: no more pending results, returning what we have 18714 1726853424.14212: results queue empty 18714 1726853424.14214: checking for any_errors_fatal 18714 1726853424.14215: done checking for any_errors_fatal 18714 1726853424.14216: checking for max_fail_percentage 18714 1726853424.14218: done checking for max_fail_percentage 18714 1726853424.14219: checking to see if all hosts have failed and the running result is not ok 18714 1726853424.14220: done checking to see if all hosts have failed 18714 1726853424.14220: getting the remaining hosts for this loop 18714 1726853424.14222: done getting the remaining hosts for this loop 18714 1726853424.14225: getting the next task for host managed_node1 18714 1726853424.14233: done getting next task for host managed_node1 18714 1726853424.14235: ^ task is: TASK: meta (flush_handlers) 18714 1726853424.14237: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18714 1726853424.14241: getting variables 18714 1726853424.14243: in VariableManager get_vars() 18714 1726853424.14275: Calling all_inventory to load vars for managed_node1 18714 1726853424.14277: Calling groups_inventory to load vars for managed_node1 18714 1726853424.14281: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853424.14291: Calling all_plugins_play to load vars for managed_node1 18714 1726853424.14294: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853424.14296: Calling groups_plugins_play to load vars for managed_node1 18714 1726853424.17737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853424.22024: done with get_vars() 18714 1726853424.22050: done getting variables 18714 1726853424.22428: in VariableManager get_vars() 18714 1726853424.22440: Calling all_inventory to load vars for managed_node1 18714 1726853424.22442: Calling groups_inventory to load vars for managed_node1 18714 1726853424.22444: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853424.22449: Calling all_plugins_play to load vars for managed_node1 18714 1726853424.22454: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853424.22457: Calling groups_plugins_play to load vars for managed_node1 18714 1726853424.24867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853424.28235: done with get_vars() 18714 1726853424.28267: done queuing things up, now waiting for results queue to drain 18714 1726853424.28269: results queue empty 18714 1726853424.28270: checking for any_errors_fatal 18714 1726853424.28274: done checking for any_errors_fatal 18714 1726853424.28275: checking for max_fail_percentage 18714 1726853424.28276: done checking for max_fail_percentage 18714 
1726853424.28276: checking to see if all hosts have failed and the running result is not ok 18714 1726853424.28277: done checking to see if all hosts have failed 18714 1726853424.28278: getting the remaining hosts for this loop 18714 1726853424.28279: done getting the remaining hosts for this loop 18714 1726853424.28281: getting the next task for host managed_node1 18714 1726853424.28289: done getting next task for host managed_node1 18714 1726853424.28290: ^ task is: TASK: meta (flush_handlers) 18714 1726853424.28291: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853424.28294: getting variables 18714 1726853424.28295: in VariableManager get_vars() 18714 1726853424.28303: Calling all_inventory to load vars for managed_node1 18714 1726853424.28305: Calling groups_inventory to load vars for managed_node1 18714 1726853424.28308: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853424.28313: Calling all_plugins_play to load vars for managed_node1 18714 1726853424.28315: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853424.28318: Calling groups_plugins_play to load vars for managed_node1 18714 1726853424.31418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853424.33608: done with get_vars() 18714 1726853424.33632: done getting variables 18714 1726853424.33688: in VariableManager get_vars() 18714 1726853424.33699: Calling all_inventory to load vars for managed_node1 18714 1726853424.33702: Calling groups_inventory to load vars for managed_node1 18714 1726853424.33704: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853424.33709: Calling all_plugins_play to load vars for 
managed_node1 18714 1726853424.33712: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853424.33714: Calling groups_plugins_play to load vars for managed_node1 18714 1726853424.34855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853424.37761: done with get_vars() 18714 1726853424.37791: done queuing things up, now waiting for results queue to drain 18714 1726853424.37793: results queue empty 18714 1726853424.37794: checking for any_errors_fatal 18714 1726853424.37796: done checking for any_errors_fatal 18714 1726853424.37796: checking for max_fail_percentage 18714 1726853424.37797: done checking for max_fail_percentage 18714 1726853424.37798: checking to see if all hosts have failed and the running result is not ok 18714 1726853424.37799: done checking to see if all hosts have failed 18714 1726853424.37799: getting the remaining hosts for this loop 18714 1726853424.37800: done getting the remaining hosts for this loop 18714 1726853424.37803: getting the next task for host managed_node1 18714 1726853424.37806: done getting next task for host managed_node1 18714 1726853424.37806: ^ task is: None 18714 1726853424.37808: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853424.37809: done queuing things up, now waiting for results queue to drain 18714 1726853424.37810: results queue empty 18714 1726853424.37810: checking for any_errors_fatal 18714 1726853424.37811: done checking for any_errors_fatal 18714 1726853424.37811: checking for max_fail_percentage 18714 1726853424.37812: done checking for max_fail_percentage 18714 1726853424.37813: checking to see if all hosts have failed and the running result is not ok 18714 1726853424.37814: done checking to see if all hosts have failed 18714 1726853424.37815: getting the next task for host managed_node1 18714 1726853424.37817: done getting next task for host managed_node1 18714 1726853424.37817: ^ task is: None 18714 1726853424.37818: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853424.37886: in VariableManager get_vars() 18714 1726853424.37907: done with get_vars() 18714 1726853424.37912: in VariableManager get_vars() 18714 1726853424.37924: done with get_vars() 18714 1726853424.37928: variable 'omit' from source: magic vars 18714 1726853424.38037: variable 'profile' from source: play vars 18714 1726853424.38348: in VariableManager get_vars() 18714 1726853424.38362: done with get_vars() 18714 1726853424.38587: variable 'omit' from source: magic vars 18714 1726853424.38651: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 18714 1726853424.39493: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853424.39515: getting the remaining hosts for this loop 18714 1726853424.39517: done getting the remaining hosts for this loop 18714 1726853424.39519: getting the next task for host managed_node1 18714 1726853424.39522: done getting next task for host managed_node1 18714 1726853424.39524: ^ task is: TASK: Gathering Facts 18714 1726853424.39525: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853424.39527: getting variables 18714 1726853424.39528: in VariableManager get_vars() 18714 1726853424.39539: Calling all_inventory to load vars for managed_node1 18714 1726853424.39541: Calling groups_inventory to load vars for managed_node1 18714 1726853424.39543: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853424.39548: Calling all_plugins_play to load vars for managed_node1 18714 1726853424.39551: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853424.39553: Calling groups_plugins_play to load vars for managed_node1 18714 1726853424.40769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853424.42268: done with get_vars() 18714 1726853424.42289: done getting variables 18714 1726853424.42334: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 13:30:24 -0400 (0:00:00.342) 0:00:20.807 ****** 18714 1726853424.42363: entering _queue_task() for managed_node1/gather_facts 18714 1726853424.43113: worker is 1 (out of 1 available) 18714 1726853424.43122: exiting _queue_task() for managed_node1/gather_facts 18714 1726853424.43134: done queuing things up, now waiting for results queue to drain 18714 1726853424.43135: waiting for pending results... 
18714 1726853424.43489: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853424.43777: in run() - task 02083763-bbaf-e784-4f7d-00000000032b 18714 1726853424.43781: variable 'ansible_search_path' from source: unknown 18714 1726853424.43784: calling self._execute() 18714 1726853424.43960: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853424.43964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853424.44078: variable 'omit' from source: magic vars 18714 1726853424.44826: variable 'ansible_distribution_major_version' from source: facts 18714 1726853424.44979: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853424.44984: variable 'omit' from source: magic vars 18714 1726853424.44998: variable 'omit' from source: magic vars 18714 1726853424.45042: variable 'omit' from source: magic vars 18714 1726853424.45134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853424.45184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853424.45211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853424.45229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853424.45242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853424.45281: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853424.45291: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853424.45301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853424.45413: Set connection var ansible_shell_executable to /bin/sh 18714 1726853424.45428: Set 
connection var ansible_timeout to 10 18714 1726853424.45440: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853424.45454: Set connection var ansible_connection to ssh 18714 1726853424.45469: Set connection var ansible_shell_type to sh 18714 1726853424.45481: Set connection var ansible_pipelining to False 18714 1726853424.45506: variable 'ansible_shell_executable' from source: unknown 18714 1726853424.45512: variable 'ansible_connection' from source: unknown 18714 1726853424.45522: variable 'ansible_module_compression' from source: unknown 18714 1726853424.45528: variable 'ansible_shell_type' from source: unknown 18714 1726853424.45533: variable 'ansible_shell_executable' from source: unknown 18714 1726853424.45578: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853424.45581: variable 'ansible_pipelining' from source: unknown 18714 1726853424.45583: variable 'ansible_timeout' from source: unknown 18714 1726853424.45585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853424.45747: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853424.45767: variable 'omit' from source: magic vars 18714 1726853424.45780: starting attempt loop 18714 1726853424.45786: running the handler 18714 1726853424.45806: variable 'ansible_facts' from source: unknown 18714 1726853424.45826: _low_level_execute_command(): starting 18714 1726853424.45957: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853424.46731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853424.46762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853424.46849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853424.48659: stdout chunk (state=3): >>>/root <<< 18714 1726853424.48692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853424.48696: stdout chunk (state=3): >>><<< 18714 1726853424.48698: stderr chunk (state=3): >>><<< 18714 1726853424.48830: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853424.48833: _low_level_execute_command(): starting 18714 1726853424.48837: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649 `" && echo ansible-tmp-1726853424.4878118-19662-240776683509649="` echo /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649 `" ) && sleep 0' 18714 1726853424.49469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853424.49579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853424.49585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853424.49596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853424.49602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853424.49644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853424.49733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853424.51646: stdout chunk (state=3): >>>ansible-tmp-1726853424.4878118-19662-240776683509649=/root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649 <<< 18714 1726853424.51787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853424.51806: stdout chunk (state=3): >>><<< 18714 1726853424.51824: stderr chunk (state=3): >>><<< 18714 1726853424.51856: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853424.4878118-19662-240776683509649=/root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853424.51956: variable 'ansible_module_compression' from source: unknown 18714 1726853424.51959: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853424.52029: variable 'ansible_facts' from source: unknown 18714 1726853424.52219: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/AnsiballZ_setup.py 18714 1726853424.52410: Sending initial data 18714 1726853424.52503: Sent initial data (154 bytes) 18714 1726853424.53205: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853424.53218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853424.53233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853424.53266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853424.53348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853424.53406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853424.53476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853424.55006: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853424.55038: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853424.55207: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp1s2tdgwm /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/AnsiballZ_setup.py <<< 18714 1726853424.55211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/AnsiballZ_setup.py" <<< 18714 1726853424.55375: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp1s2tdgwm" to remote "/root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/AnsiballZ_setup.py" <<< 18714 1726853424.57149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853424.57184: stderr chunk (state=3): >>><<< 18714 1726853424.57191: stdout chunk (state=3): >>><<< 18714 1726853424.57215: done transferring module to remote 18714 1726853424.57233: _low_level_execute_command(): starting 18714 1726853424.57242: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/ /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/AnsiballZ_setup.py && sleep 0' 18714 1726853424.57835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853424.57888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853424.57947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853424.57959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853424.57982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853424.58119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853424.59912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853424.60076: stdout chunk (state=3): >>><<< 18714 1726853424.60080: stderr chunk (state=3): >>><<< 18714 1726853424.60177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853424.60181: _low_level_execute_command(): starting 18714 1726853424.60183: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/AnsiballZ_setup.py && sleep 0' 18714 1726853424.60833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853424.60886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853424.60966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853424.60985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853424.61001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853424.61077: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853425.25309: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3302, "used": 229}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": 
"ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 591, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794852864, "block_size": 4096, "block_total": 65519099, "block_available": 63914759, "block_used": 1604340, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh 
%s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "25", "epoch": "1726853425", "epoch_int": "1726853425", "date": "2024-09-20", "time": "13:30:25", "iso8601_micro": "2024-09-20T17:30:25.191475Z", "iso8601": "2024-09-20T17:30:25Z", "iso8601_basic": "20240920T133025191475", "iso8601_basic_short": "20240920T133025", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.361328125, "5m": 0.34765625, "15m": 0.17041015625}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "peerlsr27", "lsr27", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": 
"off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]",<<< 18714 1726853425.25345: stdout chunk (state=3): >>> "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853425.27374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853425.27403: stdout chunk (state=3): >>><<< 18714 1726853425.27407: stderr chunk (state=3): >>><<< 18714 1726853425.27527: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3302, "used": 229}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 591, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794852864, "block_size": 4096, "block_total": 65519099, "block_available": 63914759, "block_used": 1604340, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "25", "epoch": "1726853425", "epoch_int": "1726853425", "date": "2024-09-20", "time": "13:30:25", "iso8601_micro": "2024-09-20T17:30:25.191475Z", "iso8601": "2024-09-20T17:30:25Z", "iso8601_basic": "20240920T133025191475", "iso8601_basic_short": "20240920T133025", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.361328125, "5m": 0.34765625, "15m": 0.17041015625}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, 
"type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "peerlsr27", "lsr27", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": 
"off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853425.27943: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853425.27968: _low_level_execute_command(): starting 18714 1726853425.27981: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853424.4878118-19662-240776683509649/ > /dev/null 2>&1 && sleep 0' 18714 1726853425.28622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853425.28686: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853425.28750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853425.28770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853425.28796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853425.28874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853425.30690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853425.30716: stderr chunk (state=3): >>><<< 18714 1726853425.30725: stdout chunk (state=3): >>><<< 18714 1726853425.30746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853425.30798: handler run complete 18714 1726853425.30922: variable 'ansible_facts' from source: unknown 18714 1726853425.31039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.31413: variable 'ansible_facts' from source: unknown 18714 1726853425.31534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.31702: attempt loop complete, returning result 18714 1726853425.31779: _execute() done 18714 1726853425.31783: dumping result to json 18714 1726853425.31785: done dumping result, returning 18714 1726853425.31787: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-00000000032b] 18714 1726853425.31789: sending task result for task 02083763-bbaf-e784-4f7d-00000000032b 18714 1726853425.32462: done sending task result for task 02083763-bbaf-e784-4f7d-00000000032b 18714 1726853425.32465: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853425.32903: no more pending results, returning what we have 18714 1726853425.32907: results queue empty 18714 1726853425.32908: checking for any_errors_fatal 18714 1726853425.32909: done checking for 
any_errors_fatal 18714 1726853425.32910: checking for max_fail_percentage 18714 1726853425.32912: done checking for max_fail_percentage 18714 1726853425.32913: checking to see if all hosts have failed and the running result is not ok 18714 1726853425.32913: done checking to see if all hosts have failed 18714 1726853425.32914: getting the remaining hosts for this loop 18714 1726853425.32915: done getting the remaining hosts for this loop 18714 1726853425.32919: getting the next task for host managed_node1 18714 1726853425.32923: done getting next task for host managed_node1 18714 1726853425.32925: ^ task is: TASK: meta (flush_handlers) 18714 1726853425.32927: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853425.32931: getting variables 18714 1726853425.32933: in VariableManager get_vars() 18714 1726853425.32960: Calling all_inventory to load vars for managed_node1 18714 1726853425.32963: Calling groups_inventory to load vars for managed_node1 18714 1726853425.32965: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.32977: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.32979: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.32982: Calling groups_plugins_play to load vars for managed_node1 18714 1726853425.34389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.35951: done with get_vars() 18714 1726853425.35978: done getting variables 18714 1726853425.36046: in VariableManager get_vars() 18714 1726853425.36059: Calling all_inventory to load vars for managed_node1 18714 1726853425.36062: Calling groups_inventory to load vars for managed_node1 18714 
1726853425.36064: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.36069: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.36073: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.36076: Calling groups_plugins_play to load vars for managed_node1 18714 1726853425.37238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.38827: done with get_vars() 18714 1726853425.38858: done queuing things up, now waiting for results queue to drain 18714 1726853425.38860: results queue empty 18714 1726853425.38860: checking for any_errors_fatal 18714 1726853425.38864: done checking for any_errors_fatal 18714 1726853425.38865: checking for max_fail_percentage 18714 1726853425.38866: done checking for max_fail_percentage 18714 1726853425.38867: checking to see if all hosts have failed and the running result is not ok 18714 1726853425.38873: done checking to see if all hosts have failed 18714 1726853425.38874: getting the remaining hosts for this loop 18714 1726853425.38875: done getting the remaining hosts for this loop 18714 1726853425.38878: getting the next task for host managed_node1 18714 1726853425.38882: done getting next task for host managed_node1 18714 1726853425.38885: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18714 1726853425.38886: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853425.38895: getting variables 18714 1726853425.38896: in VariableManager get_vars() 18714 1726853425.38908: Calling all_inventory to load vars for managed_node1 18714 1726853425.38910: Calling groups_inventory to load vars for managed_node1 18714 1726853425.38912: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.38916: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.38918: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.38921: Calling groups_plugins_play to load vars for managed_node1 18714 1726853425.40083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.41621: done with get_vars() 18714 1726853425.41640: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:30:25 -0400 (0:00:00.993) 0:00:21.800 ****** 18714 1726853425.41715: entering _queue_task() for managed_node1/include_tasks 18714 1726853425.42073: worker is 1 (out of 1 available) 18714 1726853425.42086: exiting _queue_task() for managed_node1/include_tasks 18714 1726853425.42106: done queuing things up, now waiting for results queue to drain 18714 1726853425.42107: waiting for pending results... 
18714 1726853425.42413: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18714 1726853425.42640: in run() - task 02083763-bbaf-e784-4f7d-00000000003c 18714 1726853425.42644: variable 'ansible_search_path' from source: unknown 18714 1726853425.42647: variable 'ansible_search_path' from source: unknown 18714 1726853425.42650: calling self._execute() 18714 1726853425.42708: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853425.42721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853425.42737: variable 'omit' from source: magic vars 18714 1726853425.43149: variable 'ansible_distribution_major_version' from source: facts 18714 1726853425.43167: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853425.43180: _execute() done 18714 1726853425.43201: dumping result to json 18714 1726853425.43276: done dumping result, returning 18714 1726853425.43279: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-e784-4f7d-00000000003c] 18714 1726853425.43281: sending task result for task 02083763-bbaf-e784-4f7d-00000000003c 18714 1726853425.43354: done sending task result for task 02083763-bbaf-e784-4f7d-00000000003c 18714 1726853425.43357: WORKER PROCESS EXITING 18714 1726853425.43401: no more pending results, returning what we have 18714 1726853425.43405: in VariableManager get_vars() 18714 1726853425.43450: Calling all_inventory to load vars for managed_node1 18714 1726853425.43453: Calling groups_inventory to load vars for managed_node1 18714 1726853425.43456: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.43469: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.43474: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.43477: Calling 
groups_plugins_play to load vars for managed_node1 18714 1726853425.45066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.46743: done with get_vars() 18714 1726853425.46761: variable 'ansible_search_path' from source: unknown 18714 1726853425.46762: variable 'ansible_search_path' from source: unknown 18714 1726853425.46793: we have included files to process 18714 1726853425.46794: generating all_blocks data 18714 1726853425.46796: done generating all_blocks data 18714 1726853425.46796: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853425.46798: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853425.46800: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853425.47375: done processing included file 18714 1726853425.47377: iterating over new_blocks loaded from include file 18714 1726853425.47379: in VariableManager get_vars() 18714 1726853425.47400: done with get_vars() 18714 1726853425.47402: filtering new block on tags 18714 1726853425.47418: done filtering new block on tags 18714 1726853425.47421: in VariableManager get_vars() 18714 1726853425.47441: done with get_vars() 18714 1726853425.47443: filtering new block on tags 18714 1726853425.47462: done filtering new block on tags 18714 1726853425.47465: in VariableManager get_vars() 18714 1726853425.47490: done with get_vars() 18714 1726853425.47492: filtering new block on tags 18714 1726853425.47509: done filtering new block on tags 18714 1726853425.47511: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18714 1726853425.47517: extending task lists for 
all hosts with included blocks 18714 1726853425.47904: done extending task lists 18714 1726853425.47905: done processing included files 18714 1726853425.47912: results queue empty 18714 1726853425.47912: checking for any_errors_fatal 18714 1726853425.47914: done checking for any_errors_fatal 18714 1726853425.47915: checking for max_fail_percentage 18714 1726853425.47916: done checking for max_fail_percentage 18714 1726853425.47916: checking to see if all hosts have failed and the running result is not ok 18714 1726853425.47917: done checking to see if all hosts have failed 18714 1726853425.47918: getting the remaining hosts for this loop 18714 1726853425.47919: done getting the remaining hosts for this loop 18714 1726853425.47922: getting the next task for host managed_node1 18714 1726853425.47925: done getting next task for host managed_node1 18714 1726853425.47928: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18714 1726853425.47930: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853425.47939: getting variables 18714 1726853425.47940: in VariableManager get_vars() 18714 1726853425.47953: Calling all_inventory to load vars for managed_node1 18714 1726853425.47956: Calling groups_inventory to load vars for managed_node1 18714 1726853425.47958: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.47963: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.47966: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.47968: Calling groups_plugins_play to load vars for managed_node1 18714 1726853425.53976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.56681: done with get_vars() 18714 1726853425.56705: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:30:25 -0400 (0:00:00.150) 0:00:21.951 ****** 18714 1726853425.56784: entering _queue_task() for managed_node1/setup 18714 1726853425.57522: worker is 1 (out of 1 available) 18714 1726853425.57535: exiting _queue_task() for managed_node1/setup 18714 1726853425.57547: done queuing things up, now waiting for results queue to drain 18714 1726853425.57548: waiting for pending results... 
18714 1726853425.58588: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18714 1726853425.58593: in run() - task 02083763-bbaf-e784-4f7d-00000000036c 18714 1726853425.58597: variable 'ansible_search_path' from source: unknown 18714 1726853425.58599: variable 'ansible_search_path' from source: unknown 18714 1726853425.58777: calling self._execute() 18714 1726853425.58781: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853425.58784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853425.58787: variable 'omit' from source: magic vars 18714 1726853425.59399: variable 'ansible_distribution_major_version' from source: facts 18714 1726853425.59583: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853425.59797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853425.62086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853425.62165: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853425.62213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853425.62255: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853425.62289: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853425.62376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853425.62416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853425.62445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853425.62496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853425.62526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853425.62588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853425.62624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853425.62655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853425.62700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853425.62722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853425.62906: variable '__network_required_facts' from source: role 
'' defaults 18714 1726853425.62921: variable 'ansible_facts' from source: unknown 18714 1726853425.63829: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18714 1726853425.63837: when evaluation is False, skipping this task 18714 1726853425.63844: _execute() done 18714 1726853425.63849: dumping result to json 18714 1726853425.63860: done dumping result, returning 18714 1726853425.63873: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-e784-4f7d-00000000036c] 18714 1726853425.63882: sending task result for task 02083763-bbaf-e784-4f7d-00000000036c skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853425.64127: no more pending results, returning what we have 18714 1726853425.64132: results queue empty 18714 1726853425.64133: checking for any_errors_fatal 18714 1726853425.64135: done checking for any_errors_fatal 18714 1726853425.64136: checking for max_fail_percentage 18714 1726853425.64138: done checking for max_fail_percentage 18714 1726853425.64139: checking to see if all hosts have failed and the running result is not ok 18714 1726853425.64139: done checking to see if all hosts have failed 18714 1726853425.64140: getting the remaining hosts for this loop 18714 1726853425.64142: done getting the remaining hosts for this loop 18714 1726853425.64146: getting the next task for host managed_node1 18714 1726853425.64160: done getting next task for host managed_node1 18714 1726853425.64164: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18714 1726853425.64166: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853425.64182: getting variables 18714 1726853425.64184: in VariableManager get_vars() 18714 1726853425.64224: Calling all_inventory to load vars for managed_node1 18714 1726853425.64227: Calling groups_inventory to load vars for managed_node1 18714 1726853425.64230: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.64241: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.64244: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.64247: Calling groups_plugins_play to load vars for managed_node1 18714 1726853425.64884: done sending task result for task 02083763-bbaf-e784-4f7d-00000000036c 18714 1726853425.64887: WORKER PROCESS EXITING 18714 1726853425.66607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.70070: done with get_vars() 18714 1726853425.70125: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:30:25 -0400 (0:00:00.134) 0:00:22.086 ****** 18714 1726853425.70269: entering _queue_task() for managed_node1/stat 18714 1726853425.70627: worker is 1 (out of 1 available) 18714 1726853425.70641: exiting _queue_task() for managed_node1/stat 18714 1726853425.70769: done queuing things up, now waiting for results queue to drain 18714 1726853425.70772: waiting for pending results... 
18714 1726853425.70960: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18714 1726853425.71136: in run() - task 02083763-bbaf-e784-4f7d-00000000036e 18714 1726853425.71162: variable 'ansible_search_path' from source: unknown 18714 1726853425.71173: variable 'ansible_search_path' from source: unknown 18714 1726853425.71220: calling self._execute() 18714 1726853425.71325: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853425.71336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853425.71348: variable 'omit' from source: magic vars 18714 1726853425.71760: variable 'ansible_distribution_major_version' from source: facts 18714 1726853425.71777: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853425.71944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853425.72231: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853425.72282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853425.72375: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853425.72420: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853425.72515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853425.72545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853425.72579: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853425.72610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853425.72717: variable '__network_is_ostree' from source: set_fact 18714 1726853425.72737: Evaluated conditional (not __network_is_ostree is defined): False 18714 1726853425.72745: when evaluation is False, skipping this task 18714 1726853425.72756: _execute() done 18714 1726853425.72763: dumping result to json 18714 1726853425.72774: done dumping result, returning 18714 1726853425.72785: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-e784-4f7d-00000000036e] 18714 1726853425.72793: sending task result for task 02083763-bbaf-e784-4f7d-00000000036e 18714 1726853425.73004: done sending task result for task 02083763-bbaf-e784-4f7d-00000000036e 18714 1726853425.73008: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18714 1726853425.73067: no more pending results, returning what we have 18714 1726853425.73073: results queue empty 18714 1726853425.73074: checking for any_errors_fatal 18714 1726853425.73080: done checking for any_errors_fatal 18714 1726853425.73081: checking for max_fail_percentage 18714 1726853425.73083: done checking for max_fail_percentage 18714 1726853425.73084: checking to see if all hosts have failed and the running result is not ok 18714 1726853425.73085: done checking to see if all hosts have failed 18714 1726853425.73086: getting the remaining hosts for this loop 18714 1726853425.73088: done getting the remaining hosts for this loop 18714 
1726853425.73092: getting the next task for host managed_node1 18714 1726853425.73100: done getting next task for host managed_node1 18714 1726853425.73104: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18714 1726853425.73107: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853425.73346: getting variables 18714 1726853425.73348: in VariableManager get_vars() 18714 1726853425.73390: Calling all_inventory to load vars for managed_node1 18714 1726853425.73393: Calling groups_inventory to load vars for managed_node1 18714 1726853425.73395: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.73404: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.73407: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.73410: Calling groups_plugins_play to load vars for managed_node1 18714 1726853425.74801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.76462: done with get_vars() 18714 1726853425.76493: done getting variables 18714 1726853425.76553: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:30:25 -0400 (0:00:00.063) 0:00:22.149 ****** 18714 1726853425.76591: entering _queue_task() for managed_node1/set_fact 18714 1726853425.77189: worker is 1 (out of 1 available) 18714 1726853425.77200: exiting _queue_task() for managed_node1/set_fact 18714 1726853425.77211: done queuing things up, now waiting for results queue to drain 18714 1726853425.77211: waiting for pending results... 18714 1726853425.77500: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18714 1726853425.77706: in run() - task 02083763-bbaf-e784-4f7d-00000000036f 18714 1726853425.77710: variable 'ansible_search_path' from source: unknown 18714 1726853425.77713: variable 'ansible_search_path' from source: unknown 18714 1726853425.77716: calling self._execute() 18714 1726853425.77775: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853425.77786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853425.77800: variable 'omit' from source: magic vars 18714 1726853425.78181: variable 'ansible_distribution_major_version' from source: facts 18714 1726853425.78197: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853425.78356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853425.78624: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853425.78675: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853425.78715: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 
1726853425.78789: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853425.78880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853425.78912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853425.78976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853425.78979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853425.79059: variable '__network_is_ostree' from source: set_fact 18714 1726853425.79070: Evaluated conditional (not __network_is_ostree is defined): False 18714 1726853425.79079: when evaluation is False, skipping this task 18714 1726853425.79086: _execute() done 18714 1726853425.79091: dumping result to json 18714 1726853425.79097: done dumping result, returning 18714 1726853425.79106: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-e784-4f7d-00000000036f] 18714 1726853425.79112: sending task result for task 02083763-bbaf-e784-4f7d-00000000036f 18714 1726853425.79297: done sending task result for task 02083763-bbaf-e784-4f7d-00000000036f 18714 1726853425.79300: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18714 1726853425.79385: no more pending results, returning what we 
have 18714 1726853425.79389: results queue empty 18714 1726853425.79390: checking for any_errors_fatal 18714 1726853425.79397: done checking for any_errors_fatal 18714 1726853425.79398: checking for max_fail_percentage 18714 1726853425.79400: done checking for max_fail_percentage 18714 1726853425.79401: checking to see if all hosts have failed and the running result is not ok 18714 1726853425.79402: done checking to see if all hosts have failed 18714 1726853425.79403: getting the remaining hosts for this loop 18714 1726853425.79404: done getting the remaining hosts for this loop 18714 1726853425.79408: getting the next task for host managed_node1 18714 1726853425.79419: done getting next task for host managed_node1 18714 1726853425.79424: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18714 1726853425.79426: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853425.79441: getting variables 18714 1726853425.79443: in VariableManager get_vars() 18714 1726853425.79486: Calling all_inventory to load vars for managed_node1 18714 1726853425.79489: Calling groups_inventory to load vars for managed_node1 18714 1726853425.79492: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853425.79503: Calling all_plugins_play to load vars for managed_node1 18714 1726853425.79505: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853425.79508: Calling groups_plugins_play to load vars for managed_node1 18714 1726853425.83059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853425.86253: done with get_vars() 18714 1726853425.86477: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:30:25 -0400 (0:00:00.099) 0:00:22.249 ****** 18714 1726853425.86578: entering _queue_task() for managed_node1/service_facts 18714 1726853425.86930: worker is 1 (out of 1 available) 18714 1726853425.86943: exiting _queue_task() for managed_node1/service_facts 18714 1726853425.86957: done queuing things up, now waiting for results queue to drain 18714 1726853425.86958: waiting for pending results... 
18714 1726853425.87300: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18714 1726853425.87577: in run() - task 02083763-bbaf-e784-4f7d-000000000371 18714 1726853425.87581: variable 'ansible_search_path' from source: unknown 18714 1726853425.87583: variable 'ansible_search_path' from source: unknown 18714 1726853425.87586: calling self._execute() 18714 1726853425.87589: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853425.87592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853425.87595: variable 'omit' from source: magic vars 18714 1726853425.87978: variable 'ansible_distribution_major_version' from source: facts 18714 1726853425.87992: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853425.88001: variable 'omit' from source: magic vars 18714 1726853425.88061: variable 'omit' from source: magic vars 18714 1726853425.88099: variable 'omit' from source: magic vars 18714 1726853425.88140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853425.88183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853425.88258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853425.88262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853425.88264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853425.88283: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853425.88291: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853425.88298: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18714 1726853425.88404: Set connection var ansible_shell_executable to /bin/sh 18714 1726853425.88415: Set connection var ansible_timeout to 10 18714 1726853425.88423: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853425.88433: Set connection var ansible_connection to ssh 18714 1726853425.88443: Set connection var ansible_shell_type to sh 18714 1726853425.88455: Set connection var ansible_pipelining to False 18714 1726853425.88576: variable 'ansible_shell_executable' from source: unknown 18714 1726853425.88581: variable 'ansible_connection' from source: unknown 18714 1726853425.88584: variable 'ansible_module_compression' from source: unknown 18714 1726853425.88586: variable 'ansible_shell_type' from source: unknown 18714 1726853425.88589: variable 'ansible_shell_executable' from source: unknown 18714 1726853425.88590: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853425.88592: variable 'ansible_pipelining' from source: unknown 18714 1726853425.88594: variable 'ansible_timeout' from source: unknown 18714 1726853425.88596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853425.88724: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853425.88775: variable 'omit' from source: magic vars 18714 1726853425.88778: starting attempt loop 18714 1726853425.88780: running the handler 18714 1726853425.88782: _low_level_execute_command(): starting 18714 1726853425.88784: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853425.89705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853425.89721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853425.89791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853425.89836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853425.89853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853425.89884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853425.90170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853425.91790: stdout chunk (state=3): >>>/root <<< 18714 1726853425.91922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853425.91932: stdout chunk (state=3): >>><<< 18714 1726853425.91942: stderr chunk (state=3): >>><<< 18714 1726853425.92131: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853425.92134: _low_level_execute_command(): starting 18714 1726853425.92137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052 `" && echo ansible-tmp-1726853425.920445-19735-244920476833052="` echo /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052 `" ) && sleep 0' 18714 1726853425.92694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853425.92711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853425.92907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853425.92960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853425.92999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853425.93039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853425.94932: stdout chunk (state=3): >>>ansible-tmp-1726853425.920445-19735-244920476833052=/root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052 <<< 18714 1726853425.95162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853425.95166: stdout chunk (state=3): >>><<< 18714 1726853425.95168: stderr chunk (state=3): >>><<< 18714 1726853425.95189: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853425.920445-19735-244920476833052=/root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853425.95414: variable 'ansible_module_compression' from source: unknown 18714 1726853425.95418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18714 1726853425.95420: variable 'ansible_facts' from source: unknown 18714 1726853425.95467: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/AnsiballZ_service_facts.py 18714 1726853425.95648: Sending initial data 18714 1726853425.95680: Sent initial data (161 bytes) 18714 1726853425.96387: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853425.96457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853425.96485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853425.96504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853425.96633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853425.98115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853425.98194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853425.98294: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmprn4torws /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/AnsiballZ_service_facts.py <<< 18714 1726853425.98303: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/AnsiballZ_service_facts.py" <<< 18714 1726853425.98332: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmprn4torws" to remote "/root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/AnsiballZ_service_facts.py" <<< 18714 1726853426.00082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853426.00086: stdout chunk (state=3): >>><<< 18714 1726853426.00089: stderr chunk (state=3): >>><<< 18714 1726853426.00128: done transferring module to remote 18714 1726853426.00145: _low_level_execute_command(): starting 18714 1726853426.00159: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/ /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/AnsiballZ_service_facts.py && sleep 0' 18714 1726853426.01403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853426.01417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853426.01437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853426.01482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853426.01726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853426.01762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853426.03794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853426.03798: stdout chunk (state=3): >>><<< 18714 1726853426.03805: stderr chunk (state=3): >>><<< 18714 1726853426.03808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853426.03810: _low_level_execute_command(): starting 18714 1726853426.03813: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/AnsiballZ_service_facts.py && sleep 0' 18714 1726853426.04761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853426.04770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853426.04783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853426.04800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853426.04813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853426.05074: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 
1726853426.05079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853426.05266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853427.57555: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 18714 1726853427.57568: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 18714 1726853427.57602: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 18714 1726853427.57619: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static",
"source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18714 1726853427.59098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853427.59129: stderr chunk (state=3): >>><<< 18714 1726853427.59133: stdout chunk (state=3): >>><<< 18714 1726853427.59168: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": 
{"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853427.59607: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853427.59616: _low_level_execute_command(): starting 18714 1726853427.59619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853425.920445-19735-244920476833052/ > /dev/null 2>&1 && sleep 0' 18714 1726853427.60086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853427.60091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853427.60095: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.60097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853427.60099: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853427.60101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.60148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853427.60151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853427.60153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853427.60196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853427.62008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853427.62037: stderr chunk (state=3): >>><<< 18714 1726853427.62040: stdout chunk (state=3): >>><<< 18714 1726853427.62053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853427.62062: handler run complete 18714 1726853427.62178: variable 'ansible_facts' from source: unknown 18714 1726853427.62274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853427.62546: variable 'ansible_facts' from source: unknown 18714 1726853427.62631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853427.62744: attempt loop complete, returning result 18714 1726853427.62747: _execute() done 18714 1726853427.62752: dumping result to json 18714 1726853427.62793: done dumping result, returning 18714 1726853427.62802: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-e784-4f7d-000000000371] 18714 1726853427.62804: sending task result for task 02083763-bbaf-e784-4f7d-000000000371 18714 1726853427.63524: done sending task result for task 02083763-bbaf-e784-4f7d-000000000371 18714 1726853427.63528: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853427.63596: no more pending results, returning what we have 18714 1726853427.63599: results queue empty 18714 1726853427.63600: checking for any_errors_fatal 18714 1726853427.63603: done checking for any_errors_fatal 18714 1726853427.63604: checking for max_fail_percentage 18714 1726853427.63606: done checking for max_fail_percentage 18714 1726853427.63606: checking to see if all hosts have failed and the running result is not ok 18714 1726853427.63607: done checking to see if all hosts have failed 18714 1726853427.63608: getting the 
remaining hosts for this loop 18714 1726853427.63609: done getting the remaining hosts for this loop 18714 1726853427.63612: getting the next task for host managed_node1 18714 1726853427.63618: done getting next task for host managed_node1 18714 1726853427.63621: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18714 1726853427.63623: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853427.63629: getting variables 18714 1726853427.63630: in VariableManager get_vars() 18714 1726853427.63652: Calling all_inventory to load vars for managed_node1 18714 1726853427.63654: Calling groups_inventory to load vars for managed_node1 18714 1726853427.63656: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853427.63662: Calling all_plugins_play to load vars for managed_node1 18714 1726853427.63664: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853427.63665: Calling groups_plugins_play to load vars for managed_node1 18714 1726853427.64524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853427.66345: done with get_vars() 18714 1726853427.66379: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:30:27 
-0400 (0:00:01.799) 0:00:24.048 ****** 18714 1726853427.66539: entering _queue_task() for managed_node1/package_facts 18714 1726853427.67104: worker is 1 (out of 1 available) 18714 1726853427.67135: exiting _queue_task() for managed_node1/package_facts 18714 1726853427.67155: done queuing things up, now waiting for results queue to drain 18714 1726853427.67157: waiting for pending results... 18714 1726853427.67663: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18714 1726853427.67887: in run() - task 02083763-bbaf-e784-4f7d-000000000372 18714 1726853427.67934: variable 'ansible_search_path' from source: unknown 18714 1726853427.67986: variable 'ansible_search_path' from source: unknown 18714 1726853427.68054: calling self._execute() 18714 1726853427.68357: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853427.68366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853427.68373: variable 'omit' from source: magic vars 18714 1726853427.69027: variable 'ansible_distribution_major_version' from source: facts 18714 1726853427.69069: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853427.69092: variable 'omit' from source: magic vars 18714 1726853427.69241: variable 'omit' from source: magic vars 18714 1726853427.69367: variable 'omit' from source: magic vars 18714 1726853427.69414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853427.69506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853427.69595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853427.69605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853427.69617: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853427.69672: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853427.69685: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853427.69690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853427.69833: Set connection var ansible_shell_executable to /bin/sh 18714 1726853427.69837: Set connection var ansible_timeout to 10 18714 1726853427.69845: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853427.69854: Set connection var ansible_connection to ssh 18714 1726853427.69863: Set connection var ansible_shell_type to sh 18714 1726853427.69875: Set connection var ansible_pipelining to False 18714 1726853427.69899: variable 'ansible_shell_executable' from source: unknown 18714 1726853427.69902: variable 'ansible_connection' from source: unknown 18714 1726853427.69915: variable 'ansible_module_compression' from source: unknown 18714 1726853427.69918: variable 'ansible_shell_type' from source: unknown 18714 1726853427.69922: variable 'ansible_shell_executable' from source: unknown 18714 1726853427.69924: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853427.69926: variable 'ansible_pipelining' from source: unknown 18714 1726853427.69928: variable 'ansible_timeout' from source: unknown 18714 1726853427.69949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853427.70376: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853427.70380: variable 'omit' from source: magic vars 18714 1726853427.70384: starting attempt loop 18714 
1726853427.70387: running the handler 18714 1726853427.70388: _low_level_execute_command(): starting 18714 1726853427.70390: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853427.71096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.71129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853427.71140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853427.71162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853427.71250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853427.73064: stdout chunk (state=3): >>>/root <<< 18714 1726853427.73079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853427.73094: stdout chunk (state=3): >>><<< 18714 1726853427.73133: stderr chunk (state=3): >>><<< 18714 1726853427.73160: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853427.73250: _low_level_execute_command(): starting 18714 1726853427.73268: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988 `" && echo ansible-tmp-1726853427.732348-19830-103554208297988="` echo /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988 `" ) && sleep 0' 18714 1726853427.74602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853427.74605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.74608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853427.74616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853427.74618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.74655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853427.74674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853427.74890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853427.75002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853427.76943: stdout chunk (state=3): >>>ansible-tmp-1726853427.732348-19830-103554208297988=/root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988 <<< 18714 1726853427.77043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853427.77078: stderr chunk (state=3): >>><<< 18714 1726853427.77110: stdout chunk (state=3): >>><<< 18714 1726853427.77192: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853427.732348-19830-103554208297988=/root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853427.77250: variable 'ansible_module_compression' from source: unknown 18714 1726853427.77397: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18714 1726853427.77549: variable 'ansible_facts' from source: unknown 18714 1726853427.77973: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/AnsiballZ_package_facts.py 18714 1726853427.78313: Sending initial data 18714 1726853427.78325: Sent initial data (161 bytes) 18714 1726853427.79690: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.79810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853427.79840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853427.79973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853427.81679: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853427.81792: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853427.81809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/AnsiballZ_package_facts.py" <<< 18714 1726853427.81815: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp8k4gcjmq /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/AnsiballZ_package_facts.py <<< 18714 1726853427.81861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp8k4gcjmq" to remote "/root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/AnsiballZ_package_facts.py" <<< 18714 1726853427.84683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853427.84687: stderr chunk (state=3): >>><<< 18714 1726853427.84689: stdout chunk (state=3): >>><<< 18714 1726853427.84691: done transferring module to remote 18714 1726853427.84693: _low_level_execute_command(): starting 18714 1726853427.84695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/ /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/AnsiballZ_package_facts.py && sleep 0' 18714 1726853427.85831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.85862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853427.85878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853427.86174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853427.86208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853427.88029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853427.88067: stderr chunk (state=3): >>><<< 18714 1726853427.88078: stdout chunk (state=3): >>><<< 18714 1726853427.88123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853427.88216: _low_level_execute_command(): starting 18714 1726853427.88219: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/AnsiballZ_package_facts.py && sleep 0' 18714 1726853427.89213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853427.89216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853427.89218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853427.89221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853427.89497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853427.89910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
18714 1726853427.89986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853428.34027: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 18714 1726853428.34190: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": 
"centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", 
"version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": 
"libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", 
"release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 18714 1726853428.34412: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": 
"libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": 
"cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": 
"makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", 
"version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", 
"version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": 
[{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", 
"version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": 
"2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": 
[{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18714 1726853428.36059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853428.36062: stdout chunk (state=3): >>><<< 18714 1726853428.36065: stderr chunk (state=3): >>><<< 18714 1726853428.36205: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853428.41062: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853428.41066: _low_level_execute_command(): starting 18714 1726853428.41069: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853427.732348-19830-103554208297988/ > /dev/null 2>&1 && sleep 0' 18714 1726853428.42265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853428.42282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853428.42296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853428.42388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853428.42485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853428.42582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853428.42619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853428.42662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853428.44633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853428.44637: stdout chunk (state=3): >>><<< 18714 1726853428.44639: stderr chunk (state=3): >>><<< 18714 1726853428.44658: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 18714 1726853428.44782: handler run complete 18714 1726853428.46456: variable 'ansible_facts' from source: unknown 18714 1726853428.47491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853428.51504: variable 'ansible_facts' from source: unknown 18714 1726853428.52629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853428.54223: attempt loop complete, returning result 18714 1726853428.54227: _execute() done 18714 1726853428.54230: dumping result to json 18714 1726853428.54556: done dumping result, returning 18714 1726853428.54775: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-e784-4f7d-000000000372] 18714 1726853428.54779: sending task result for task 02083763-bbaf-e784-4f7d-000000000372 18714 1726853428.59033: done sending task result for task 02083763-bbaf-e784-4f7d-000000000372 18714 1726853428.59039: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853428.59189: no more pending results, returning what we have 18714 1726853428.59192: results queue empty 18714 1726853428.59193: checking for any_errors_fatal 18714 1726853428.59198: done checking for any_errors_fatal 18714 1726853428.59199: checking for max_fail_percentage 18714 1726853428.59201: done checking for max_fail_percentage 18714 1726853428.59201: checking to see if all hosts have failed and the running result is not ok 18714 1726853428.59202: done checking to see if all hosts have failed 18714 1726853428.59203: getting the remaining hosts for this loop 18714 1726853428.59204: done getting the remaining hosts for this loop 18714 1726853428.59207: getting the next task for host managed_node1 18714 1726853428.59214: done 
getting next task for host managed_node1 18714 1726853428.59218: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18714 1726853428.59220: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853428.59230: getting variables 18714 1726853428.59232: in VariableManager get_vars() 18714 1726853428.59260: Calling all_inventory to load vars for managed_node1 18714 1726853428.59263: Calling groups_inventory to load vars for managed_node1 18714 1726853428.59265: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853428.59281: Calling all_plugins_play to load vars for managed_node1 18714 1726853428.59284: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853428.59290: Calling groups_plugins_play to load vars for managed_node1 18714 1726853428.61948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853428.65390: done with get_vars() 18714 1726853428.65495: done getting variables 18714 1726853428.65564: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:30:28 -0400 (0:00:00.992) 0:00:25.041 ****** 18714 1726853428.65757: entering _queue_task() for managed_node1/debug 18714 1726853428.66372: worker is 1 (out of 1 available) 18714 
1726853428.66500: exiting _queue_task() for managed_node1/debug 18714 1726853428.66512: done queuing things up, now waiting for results queue to drain 18714 1726853428.66514: waiting for pending results... 18714 1726853428.67297: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18714 1726853428.67354: in run() - task 02083763-bbaf-e784-4f7d-00000000003d 18714 1726853428.67395: variable 'ansible_search_path' from source: unknown 18714 1726853428.67401: variable 'ansible_search_path' from source: unknown 18714 1726853428.67726: calling self._execute() 18714 1726853428.67877: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853428.67881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853428.67884: variable 'omit' from source: magic vars 18714 1726853428.68885: variable 'ansible_distribution_major_version' from source: facts 18714 1726853428.68897: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853428.68908: variable 'omit' from source: magic vars 18714 1726853428.69310: variable 'omit' from source: magic vars 18714 1726853428.69314: variable 'network_provider' from source: set_fact 18714 1726853428.69318: variable 'omit' from source: magic vars 18714 1726853428.69436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853428.69480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853428.69557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853428.69661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853428.69680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 18714 1726853428.69885: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853428.69889: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853428.69892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853428.70479: Set connection var ansible_shell_executable to /bin/sh 18714 1726853428.70483: Set connection var ansible_timeout to 10 18714 1726853428.70486: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853428.70489: Set connection var ansible_connection to ssh 18714 1726853428.70491: Set connection var ansible_shell_type to sh 18714 1726853428.70493: Set connection var ansible_pipelining to False 18714 1726853428.70495: variable 'ansible_shell_executable' from source: unknown 18714 1726853428.70498: variable 'ansible_connection' from source: unknown 18714 1726853428.70500: variable 'ansible_module_compression' from source: unknown 18714 1726853428.70502: variable 'ansible_shell_type' from source: unknown 18714 1726853428.70504: variable 'ansible_shell_executable' from source: unknown 18714 1726853428.70506: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853428.70508: variable 'ansible_pipelining' from source: unknown 18714 1726853428.70509: variable 'ansible_timeout' from source: unknown 18714 1726853428.70512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853428.71094: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853428.71098: variable 'omit' from source: magic vars 18714 1726853428.71101: starting attempt loop 18714 1726853428.71103: running the handler 18714 1726853428.71111: handler run 
complete 18714 1726853428.71113: attempt loop complete, returning result 18714 1726853428.71115: _execute() done 18714 1726853428.71118: dumping result to json 18714 1726853428.71120: done dumping result, returning 18714 1726853428.71122: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-e784-4f7d-00000000003d] 18714 1726853428.71124: sending task result for task 02083763-bbaf-e784-4f7d-00000000003d 18714 1726853428.71638: done sending task result for task 02083763-bbaf-e784-4f7d-00000000003d 18714 1726853428.71641: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 18714 1726853428.71700: no more pending results, returning what we have 18714 1726853428.71703: results queue empty 18714 1726853428.71705: checking for any_errors_fatal 18714 1726853428.71712: done checking for any_errors_fatal 18714 1726853428.71713: checking for max_fail_percentage 18714 1726853428.71715: done checking for max_fail_percentage 18714 1726853428.71716: checking to see if all hosts have failed and the running result is not ok 18714 1726853428.71717: done checking to see if all hosts have failed 18714 1726853428.71717: getting the remaining hosts for this loop 18714 1726853428.71719: done getting the remaining hosts for this loop 18714 1726853428.71723: getting the next task for host managed_node1 18714 1726853428.71728: done getting next task for host managed_node1 18714 1726853428.71732: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18714 1726853428.71734: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853428.71744: getting variables 18714 1726853428.71746: in VariableManager get_vars() 18714 1726853428.71788: Calling all_inventory to load vars for managed_node1 18714 1726853428.71791: Calling groups_inventory to load vars for managed_node1 18714 1726853428.71794: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853428.71804: Calling all_plugins_play to load vars for managed_node1 18714 1726853428.71807: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853428.71810: Calling groups_plugins_play to load vars for managed_node1 18714 1726853428.75199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853428.79087: done with get_vars() 18714 1726853428.79121: done getting variables 18714 1726853428.79775: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:30:28 -0400 (0:00:00.140) 0:00:25.181 ****** 18714 1726853428.79813: entering _queue_task() for managed_node1/fail 18714 1726853428.80986: worker is 1 (out of 1 available) 18714 1726853428.81000: exiting _queue_task() for managed_node1/fail 18714 1726853428.81013: done queuing things up, now waiting for results queue to drain 18714 1726853428.81014: waiting for pending results... 
18714 1726853428.81994: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18714 1726853428.82021: in run() - task 02083763-bbaf-e784-4f7d-00000000003e 18714 1726853428.82050: variable 'ansible_search_path' from source: unknown 18714 1726853428.82063: variable 'ansible_search_path' from source: unknown 18714 1726853428.82111: calling self._execute() 18714 1726853428.82229: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853428.82241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853428.82266: variable 'omit' from source: magic vars 18714 1726853428.82679: variable 'ansible_distribution_major_version' from source: facts 18714 1726853428.82701: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853428.82828: variable 'network_state' from source: role '' defaults 18714 1726853428.82846: Evaluated conditional (network_state != {}): False 18714 1726853428.82858: when evaluation is False, skipping this task 18714 1726853428.82867: _execute() done 18714 1726853428.82876: dumping result to json 18714 1726853428.82883: done dumping result, returning 18714 1726853428.82894: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-e784-4f7d-00000000003e] 18714 1726853428.82902: sending task result for task 02083763-bbaf-e784-4f7d-00000000003e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853428.83069: no more pending results, returning what we have 18714 1726853428.83075: results queue empty 18714 1726853428.83077: checking for any_errors_fatal 18714 1726853428.83084: done 
checking for any_errors_fatal 18714 1726853428.83085: checking for max_fail_percentage 18714 1726853428.83087: done checking for max_fail_percentage 18714 1726853428.83088: checking to see if all hosts have failed and the running result is not ok 18714 1726853428.83089: done checking to see if all hosts have failed 18714 1726853428.83090: getting the remaining hosts for this loop 18714 1726853428.83091: done getting the remaining hosts for this loop 18714 1726853428.83097: getting the next task for host managed_node1 18714 1726853428.83103: done getting next task for host managed_node1 18714 1726853428.83107: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18714 1726853428.83110: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853428.83131: getting variables 18714 1726853428.83133: in VariableManager get_vars() 18714 1726853428.83241: Calling all_inventory to load vars for managed_node1 18714 1726853428.83244: Calling groups_inventory to load vars for managed_node1 18714 1726853428.83246: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853428.83254: done sending task result for task 02083763-bbaf-e784-4f7d-00000000003e 18714 1726853428.83257: WORKER PROCESS EXITING 18714 1726853428.83269: Calling all_plugins_play to load vars for managed_node1 18714 1726853428.83274: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853428.83277: Calling groups_plugins_play to load vars for managed_node1 18714 1726853428.86011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853428.88348: done with get_vars() 18714 1726853428.88386: done getting variables 18714 1726853428.88444: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:30:28 -0400 (0:00:00.090) 0:00:25.272 ****** 18714 1726853428.88886: entering _queue_task() for managed_node1/fail 18714 1726853428.89749: worker is 1 (out of 1 available) 18714 1726853428.89766: exiting _queue_task() for managed_node1/fail 18714 1726853428.90082: done queuing things up, now waiting for results queue to drain 18714 1726853428.90083: waiting for pending results... 
18714 1726853428.90364: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18714 1726853428.90542: in run() - task 02083763-bbaf-e784-4f7d-00000000003f 18714 1726853428.90678: variable 'ansible_search_path' from source: unknown 18714 1726853428.90686: variable 'ansible_search_path' from source: unknown 18714 1726853428.90688: calling self._execute() 18714 1726853428.90900: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853428.90914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853428.90977: variable 'omit' from source: magic vars 18714 1726853428.91793: variable 'ansible_distribution_major_version' from source: facts 18714 1726853428.91874: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853428.92091: variable 'network_state' from source: role '' defaults 18714 1726853428.92109: Evaluated conditional (network_state != {}): False 18714 1726853428.92194: when evaluation is False, skipping this task 18714 1726853428.92197: _execute() done 18714 1726853428.92200: dumping result to json 18714 1726853428.92202: done dumping result, returning 18714 1726853428.92205: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-e784-4f7d-00000000003f] 18714 1726853428.92208: sending task result for task 02083763-bbaf-e784-4f7d-00000000003f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853428.92525: no more pending results, returning what we have 18714 1726853428.92530: results queue empty 18714 1726853428.92531: checking for any_errors_fatal 18714 1726853428.92539: done checking for any_errors_fatal 
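The skip pattern visible in these records (first `Evaluated conditional (...): True`, then a `False` that produces a `skipping:` result carrying `false_condition` and `skip_reason`) can be sketched as follows. This is a hypothetical helper mimicking the logged behavior, not Ansible's actual implementation: `when` clauses are checked in order and the first `False` short-circuits the task into a skip result.

```python
# Hypothetical sketch of the when-clause short-circuit seen in the log above;
# not Ansible's real evaluator, which renders Jinja2 expressions.
def evaluate_when(clauses, variables):
    """Return a task-result dict shaped like the skipped/ok records in the log."""
    for clause, predicate in clauses:
        if not predicate(variables):
            return {"changed": False,
                    "false_condition": clause,
                    "skip_reason": "Conditional result was False"}
    return {"changed": False}

# Values matching the logged run: EL10 host, empty network_state role default.
variables = {"ansible_distribution_major_version": "10", "network_state": {}}
clauses = [
    ("ansible_distribution_major_version != '6'",
     lambda v: v["ansible_distribution_major_version"] != "6"),
    ("network_state != {}",
     lambda v: v["network_state"] != {}),
]
print(evaluate_when(clauses, variables))
# -> {'changed': False, 'false_condition': 'network_state != {}',
#     'skip_reason': 'Conditional result was False'}
```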
18714 1726853428.92540: checking for max_fail_percentage 18714 1726853428.92542: done checking for max_fail_percentage 18714 1726853428.92544: checking to see if all hosts have failed and the running result is not ok 18714 1726853428.92545: done checking to see if all hosts have failed 18714 1726853428.92545: getting the remaining hosts for this loop 18714 1726853428.92547: done getting the remaining hosts for this loop 18714 1726853428.92551: getting the next task for host managed_node1 18714 1726853428.92558: done getting next task for host managed_node1 18714 1726853428.92562: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18714 1726853428.92565: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853428.92690: done sending task result for task 02083763-bbaf-e784-4f7d-00000000003f 18714 1726853428.92693: WORKER PROCESS EXITING 18714 1726853428.92704: getting variables 18714 1726853428.92706: in VariableManager get_vars() 18714 1726853428.92860: Calling all_inventory to load vars for managed_node1 18714 1726853428.92863: Calling groups_inventory to load vars for managed_node1 18714 1726853428.92866: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853428.93016: Calling all_plugins_play to load vars for managed_node1 18714 1726853428.93020: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853428.93024: Calling groups_plugins_play to load vars for managed_node1 18714 1726853428.96197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853428.99921: done with get_vars() 18714 1726853428.99945: done getting variables 18714 1726853429.00087: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:30:29 -0400 (0:00:00.113) 0:00:25.385 ****** 18714 1726853429.00234: entering _queue_task() for managed_node1/fail 18714 1726853429.00903: worker is 1 (out of 1 available) 18714 1726853429.00917: exiting _queue_task() for managed_node1/fail 18714 1726853429.00929: done queuing things up, now waiting for results queue to drain 18714 1726853429.00930: waiting for pending results... 
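The teaming-abort task queued here evaluates the Jinja2 condition `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`, as recorded later in the run. A plain-Python sketch of that filter chain (`has_team` and the sample `network_connections` value are illustrative assumptions, not data from the log):

```python
import re

# Sketch of the selectattr("type", "defined") | selectattr("type", "match", "^team$")
# chain: keep entries that define "type", then keep those matching ^team$.
def has_team(items):
    """True if any entry defines a "type" key matching ^team$."""
    return any(re.match(r"^team$", item["type"])
               for item in items if "type" in item)

network_connections = [{"name": "eth0-profile", "type": "ethernet"}]  # hypothetical sample
network_state = {}  # role default, as in the logged run

condition = has_team(network_connections) or has_team(network_state.get("interfaces", []))
print(condition)  # -> False, so the abort task is skipped
```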
18714 1726853429.01433: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18714 1726853429.01651: in run() - task 02083763-bbaf-e784-4f7d-000000000040 18714 1726853429.01745: variable 'ansible_search_path' from source: unknown 18714 1726853429.01749: variable 'ansible_search_path' from source: unknown 18714 1726853429.01877: calling self._execute() 18714 1726853429.02071: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853429.02084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853429.02097: variable 'omit' from source: magic vars 18714 1726853429.03020: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.03023: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853429.03309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853429.08155: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853429.08353: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853429.08398: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853429.08493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853429.08545: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853429.08714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.08823: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.08914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.09007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.09081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.09321: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.09324: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18714 1726853429.09608: variable 'ansible_distribution' from source: facts 18714 1726853429.09649: variable '__network_rh_distros' from source: role '' defaults 18714 1726853429.09653: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18714 1726853429.10376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.10381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.10384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 
1726853429.10387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.10389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.10441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.10533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.10724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.10728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.10730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.10794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.10858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18714 1726853429.10967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.11011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.11067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.11724: variable 'network_connections' from source: play vars 18714 1726853429.11790: variable 'profile' from source: play vars 18714 1726853429.12025: variable 'profile' from source: play vars 18714 1726853429.12028: variable 'interface' from source: set_fact 18714 1726853429.12061: variable 'interface' from source: set_fact 18714 1726853429.12145: variable 'network_state' from source: role '' defaults 18714 1726853429.12215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853429.12634: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853429.12811: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853429.12841: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853429.12872: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853429.13136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853429.13160: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853429.13358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.13385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853429.13409: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18714 1726853429.13413: when evaluation is False, skipping this task 18714 1726853429.13415: _execute() done 18714 1726853429.13418: dumping result to json 18714 1726853429.13420: done dumping result, returning 18714 1726853429.13434: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-e784-4f7d-000000000040] 18714 1726853429.13436: sending task result for task 02083763-bbaf-e784-4f7d-000000000040 18714 1726853429.13745: done sending task result for task 02083763-bbaf-e784-4f7d-000000000040 18714 1726853429.13864: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18714 
1726853429.13917: no more pending results, returning what we have 18714 1726853429.13921: results queue empty 18714 1726853429.13922: checking for any_errors_fatal 18714 1726853429.13930: done checking for any_errors_fatal 18714 1726853429.13930: checking for max_fail_percentage 18714 1726853429.13932: done checking for max_fail_percentage 18714 1726853429.13933: checking to see if all hosts have failed and the running result is not ok 18714 1726853429.13934: done checking to see if all hosts have failed 18714 1726853429.13935: getting the remaining hosts for this loop 18714 1726853429.13936: done getting the remaining hosts for this loop 18714 1726853429.13940: getting the next task for host managed_node1 18714 1726853429.13946: done getting next task for host managed_node1 18714 1726853429.13951: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18714 1726853429.13953: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853429.13969: getting variables 18714 1726853429.13972: in VariableManager get_vars() 18714 1726853429.14012: Calling all_inventory to load vars for managed_node1 18714 1726853429.14015: Calling groups_inventory to load vars for managed_node1 18714 1726853429.14018: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853429.14028: Calling all_plugins_play to load vars for managed_node1 18714 1726853429.14031: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853429.14034: Calling groups_plugins_play to load vars for managed_node1 18714 1726853429.17804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853429.21349: done with get_vars() 18714 1726853429.21446: done getting variables 18714 1726853429.21567: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:30:29 -0400 (0:00:00.214) 0:00:25.600 ****** 18714 1726853429.21667: entering _queue_task() for managed_node1/dnf 18714 1726853429.22491: worker is 1 (out of 1 available) 18714 1726853429.22668: exiting _queue_task() for managed_node1/dnf 18714 1726853429.22680: done queuing things up, now waiting for results queue to drain 18714 1726853429.22681: waiting for pending results... 
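The task skipped above hinged on the conditional `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | ... | length > 0`. As a rough illustration of what that Jinja2 chain computes, here is a minimal Python emulation (the function name and sample data are hypothetical, not part of the role):

```python
import re

def has_team(connections, state):
    """Emulate the role's conditional: does any connection profile or
    network_state interface have a defined type matching ^team$?"""
    def team_items(items):
        # selectattr("type", "defined") keeps items with a 'type' key;
        # selectattr("type", "match", "^team$") keeps those matching the regex.
        return [i for i in items if "type" in i and re.match(r"^team$", str(i["type"]))]
    return bool(team_items(connections) or team_items(state.get("interfaces", [])))

# Hypothetical data: an ethernet-only play (as in this run) vs. a team profile.
print(has_team([{"name": "lsr27", "type": "ethernet"}], {}))  # False -> task skipped
print(has_team([{"name": "team0", "type": "team"}], {}))      # True  -> task would run
```

Because neither the play's `network_connections` nor `network_state` defines a team interface here, the conditional evaluates to False and the EL10 teaming abort is skipped.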
18714 1726853429.23012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18714 1726853429.23437: in run() - task 02083763-bbaf-e784-4f7d-000000000041 18714 1726853429.23441: variable 'ansible_search_path' from source: unknown 18714 1726853429.23443: variable 'ansible_search_path' from source: unknown 18714 1726853429.23445: calling self._execute() 18714 1726853429.23598: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853429.23608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853429.23669: variable 'omit' from source: magic vars 18714 1726853429.24710: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.24892: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853429.25423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853429.30230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853429.30325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853429.30370: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853429.30430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853429.30463: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853429.30588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.30641: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.30675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.30724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.30752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.30887: variable 'ansible_distribution' from source: facts 18714 1726853429.30902: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.30925: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18714 1726853429.31075: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853429.31248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.31251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.31255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.31307: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.31328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.31379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.31410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.31448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.31516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.31521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.31577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.31599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 
1726853429.31635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.31692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.31734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.31901: variable 'network_connections' from source: play vars 18714 1726853429.31953: variable 'profile' from source: play vars 18714 1726853429.32006: variable 'profile' from source: play vars 18714 1726853429.32018: variable 'interface' from source: set_fact 18714 1726853429.32085: variable 'interface' from source: set_fact 18714 1726853429.32331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853429.32790: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853429.32812: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853429.32914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853429.33205: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853429.33260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853429.33288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853429.33375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.33484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853429.33704: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853429.35328: variable 'network_connections' from source: play vars 18714 1726853429.35352: variable 'profile' from source: play vars 18714 1726853429.35488: variable 'profile' from source: play vars 18714 1726853429.35491: variable 'interface' from source: set_fact 18714 1726853429.35884: variable 'interface' from source: set_fact 18714 1726853429.35887: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853429.35890: when evaluation is False, skipping this task 18714 1726853429.35892: _execute() done 18714 1726853429.35894: dumping result to json 18714 1726853429.35896: done dumping result, returning 18714 1726853429.35898: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000041] 18714 1726853429.35901: sending task result for task 02083763-bbaf-e784-4f7d-000000000041 18714 1726853429.35968: done sending task result for task 02083763-bbaf-e784-4f7d-000000000041 18714 1726853429.35973: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 18714 1726853429.36037: no more pending results, returning what we have 18714 1726853429.36041: results queue empty 18714 1726853429.36042: checking for any_errors_fatal 18714 1726853429.36051: done checking for any_errors_fatal 18714 1726853429.36052: checking for max_fail_percentage 18714 1726853429.36054: done checking for max_fail_percentage 18714 1726853429.36055: checking to see if all hosts have failed and the running result is not ok 18714 1726853429.36056: done checking to see if all hosts have failed 18714 1726853429.36057: getting the remaining hosts for this loop 18714 1726853429.36058: done getting the remaining hosts for this loop 18714 1726853429.36063: getting the next task for host managed_node1 18714 1726853429.36070: done getting next task for host managed_node1 18714 1726853429.36076: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18714 1726853429.36078: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853429.36094: getting variables 18714 1726853429.36096: in VariableManager get_vars() 18714 1726853429.36136: Calling all_inventory to load vars for managed_node1 18714 1726853429.36139: Calling groups_inventory to load vars for managed_node1 18714 1726853429.36142: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853429.36153: Calling all_plugins_play to load vars for managed_node1 18714 1726853429.36156: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853429.36160: Calling groups_plugins_play to load vars for managed_node1 18714 1726853429.40785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853429.44174: done with get_vars() 18714 1726853429.44205: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18714 1726853429.44299: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:30:29 -0400 (0:00:00.226) 0:00:25.826 ****** 18714 1726853429.44333: entering _queue_task() for managed_node1/yum 18714 1726853429.44730: worker is 1 (out of 1 available) 18714 1726853429.44743: exiting _queue_task() for managed_node1/yum 18714 1726853429.44755: done queuing things up, now waiting for results queue to drain 18714 1726853429.44756: waiting for pending results... 
18714 1726853429.45078: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18714 1726853429.45215: in run() - task 02083763-bbaf-e784-4f7d-000000000042 18714 1726853429.45244: variable 'ansible_search_path' from source: unknown 18714 1726853429.45253: variable 'ansible_search_path' from source: unknown 18714 1726853429.45296: calling self._execute() 18714 1726853429.45404: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853429.45426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853429.45525: variable 'omit' from source: magic vars 18714 1726853429.45883: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.45901: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853429.46114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853429.48607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853429.48678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853429.48728: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853429.48766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853429.48804: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853429.48900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.48957: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.48992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.49049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.49076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.49221: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.49224: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18714 1726853429.49226: when evaluation is False, skipping this task 18714 1726853429.49229: _execute() done 18714 1726853429.49231: dumping result to json 18714 1726853429.49233: done dumping result, returning 18714 1726853429.49235: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000042] 18714 1726853429.49238: sending task result for task 02083763-bbaf-e784-4f7d-000000000042 18714 1726853429.49597: done sending task result for task 02083763-bbaf-e784-4f7d-000000000042 18714 1726853429.49600: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18714 1726853429.49652: no more pending results, returning 
what we have 18714 1726853429.49655: results queue empty 18714 1726853429.49656: checking for any_errors_fatal 18714 1726853429.49663: done checking for any_errors_fatal 18714 1726853429.49664: checking for max_fail_percentage 18714 1726853429.49665: done checking for max_fail_percentage 18714 1726853429.49666: checking to see if all hosts have failed and the running result is not ok 18714 1726853429.49667: done checking to see if all hosts have failed 18714 1726853429.49668: getting the remaining hosts for this loop 18714 1726853429.49669: done getting the remaining hosts for this loop 18714 1726853429.49675: getting the next task for host managed_node1 18714 1726853429.49680: done getting next task for host managed_node1 18714 1726853429.49684: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18714 1726853429.49686: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853429.49700: getting variables 18714 1726853429.49701: in VariableManager get_vars() 18714 1726853429.49742: Calling all_inventory to load vars for managed_node1 18714 1726853429.49745: Calling groups_inventory to load vars for managed_node1 18714 1726853429.49748: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853429.49757: Calling all_plugins_play to load vars for managed_node1 18714 1726853429.49760: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853429.49763: Calling groups_plugins_play to load vars for managed_node1 18714 1726853429.51353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853429.53124: done with get_vars() 18714 1726853429.53154: done getting variables 18714 1726853429.53220: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:30:29 -0400 (0:00:00.089) 0:00:25.916 ****** 18714 1726853429.53263: entering _queue_task() for managed_node1/fail 18714 1726853429.53626: worker is 1 (out of 1 available) 18714 1726853429.53640: exiting _queue_task() for managed_node1/fail 18714 1726853429.53883: done queuing things up, now waiting for results queue to drain 18714 1726853429.53884: waiting for pending results... 
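The YUM check above was skipped on `ansible_distribution_major_version | int < 8`: EL8 and later manage packages through dnf, so the yum-specific task only applies below 8. A small sketch of that test (values shown are hypothetical examples, not facts from this host):

```python
def use_yum(major_version: str) -> bool:
    # The fact arrives as a string, so the role applies Jinja2's `int`
    # filter before comparing; this mirrors `... | int < 8`.
    return int(major_version) < 8

print(use_yum("7"))   # True  -> the yum update check would run
print(use_yum("10"))  # False -> task skipped, as in the log above
```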
18714 1726853429.53941: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18714 1726853429.54055: in run() - task 02083763-bbaf-e784-4f7d-000000000043 18714 1726853429.54078: variable 'ansible_search_path' from source: unknown 18714 1726853429.54086: variable 'ansible_search_path' from source: unknown 18714 1726853429.54140: calling self._execute() 18714 1726853429.54253: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853429.54275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853429.54282: variable 'omit' from source: magic vars 18714 1726853429.54760: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.54764: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853429.54841: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853429.55070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853429.57835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853429.57905: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853429.57961: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853429.58031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853429.58047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853429.58375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18714 1726853429.58379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.58382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.58384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.58386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.58388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.58390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.58392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.58422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.58442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.58491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.58535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.58567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.58624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.58651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.58849: variable 'network_connections' from source: play vars 18714 1726853429.58867: variable 'profile' from source: play vars 18714 1726853429.58958: variable 'profile' from source: play vars 18714 1726853429.58967: variable 'interface' from source: set_fact 18714 1726853429.59038: variable 'interface' from source: set_fact 18714 1726853429.59128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853429.59326: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853429.59386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853429.59476: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853429.59483: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853429.59715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853429.59719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853429.59721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.59723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853429.59880: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853429.60146: variable 'network_connections' from source: play vars 18714 1726853429.60157: variable 'profile' from source: play vars 18714 1726853429.60227: variable 'profile' from source: play vars 18714 1726853429.60238: variable 'interface' from source: set_fact 18714 1726853429.60315: variable 'interface' from source: set_fact 18714 1726853429.60348: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853429.60364: when evaluation is False, skipping this task 18714 1726853429.60377: _execute() done 18714 1726853429.60387: dumping result to json 18714 1726853429.60397: done dumping result, returning 18714 1726853429.60413: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000043] 18714 1726853429.60442: sending task result for task 02083763-bbaf-e784-4f7d-000000000043 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18714 1726853429.60626: no more pending results, returning what we have 18714 1726853429.60632: results queue empty 18714 1726853429.60634: checking for any_errors_fatal 18714 1726853429.60639: done checking for any_errors_fatal 18714 1726853429.60640: checking for max_fail_percentage 18714 1726853429.60642: done checking for max_fail_percentage 18714 1726853429.60644: checking to see if all hosts have failed and the running result is not ok 18714 1726853429.60644: done checking to see if all hosts have failed 18714 1726853429.60645: getting the remaining hosts for this loop 18714 1726853429.60647: done getting the remaining hosts for this loop 18714 1726853429.60651: getting the next task for host managed_node1 18714 1726853429.60660: done getting next task for host managed_node1 18714 1726853429.60666: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18714 1726853429.60668: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853429.60784: getting variables 18714 1726853429.60786: in VariableManager get_vars() 18714 1726853429.60931: Calling all_inventory to load vars for managed_node1 18714 1726853429.60934: Calling groups_inventory to load vars for managed_node1 18714 1726853429.60937: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853429.60948: Calling all_plugins_play to load vars for managed_node1 18714 1726853429.60952: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853429.60955: Calling groups_plugins_play to load vars for managed_node1 18714 1726853429.61485: done sending task result for task 02083763-bbaf-e784-4f7d-000000000043 18714 1726853429.61488: WORKER PROCESS EXITING 18714 1726853429.63262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853429.65840: done with get_vars() 18714 1726853429.65870: done getting variables 18714 1726853429.66249: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:30:29 -0400 (0:00:00.130) 0:00:26.046 ****** 18714 1726853429.66378: entering _queue_task() for managed_node1/package 18714 1726853429.67544: worker is 1 (out of 1 available) 18714 1726853429.67554: exiting _queue_task() for managed_node1/package 18714 1726853429.67562: done queuing things up, now waiting for results queue to drain 18714 1726853429.67563: waiting for pending results... 
18714 1726853429.67759: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18714 1726853429.67854: in run() - task 02083763-bbaf-e784-4f7d-000000000044 18714 1726853429.67874: variable 'ansible_search_path' from source: unknown 18714 1726853429.68079: variable 'ansible_search_path' from source: unknown 18714 1726853429.68117: calling self._execute() 18714 1726853429.68212: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853429.68217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853429.68226: variable 'omit' from source: magic vars 18714 1726853429.69579: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.69583: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853429.69662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853429.70340: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853429.70476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853429.70480: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853429.70629: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853429.70757: variable 'network_packages' from source: role '' defaults 18714 1726853429.71172: variable '__network_provider_setup' from source: role '' defaults 18714 1726853429.71184: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853429.71250: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853429.71262: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853429.71529: variable 
'__network_packages_default_nm' from source: role '' defaults 18714 1726853429.71910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853429.76029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853429.76109: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853429.76130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853429.76282: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853429.76434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853429.76477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.76503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.76528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.76570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.76792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 
1726853429.76960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.76963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.76965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.76968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.76974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.77562: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18714 1726853429.77705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.77729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.77752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.77794: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.77808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.77911: variable 'ansible_python' from source: facts 18714 1726853429.77924: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18714 1726853429.78020: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853429.78085: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853429.78226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.78249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.78277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.78314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.78328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.78397: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853429.78412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853429.78453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.78463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853429.78479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853429.78615: variable 'network_connections' from source: play vars 18714 1726853429.78622: variable 'profile' from source: play vars 18714 1726853429.78724: variable 'profile' from source: play vars 18714 1726853429.78728: variable 'interface' from source: set_fact 18714 1726853429.78797: variable 'interface' from source: set_fact 18714 1726853429.78866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853429.78895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853429.78924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853429.78955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853429.79003: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853429.79464: variable 'network_connections' from source: play vars 18714 1726853429.79467: variable 'profile' from source: play vars 18714 1726853429.79968: variable 'profile' from source: play vars 18714 1726853429.79979: variable 'interface' from source: set_fact 18714 1726853429.80189: variable 'interface' from source: set_fact 18714 1726853429.80192: variable '__network_packages_default_wireless' from source: role '' defaults 18714 1726853429.80194: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853429.80736: variable 'network_connections' from source: play vars 18714 1726853429.80739: variable 'profile' from source: play vars 18714 1726853429.80819: variable 'profile' from source: play vars 18714 1726853429.80822: variable 'interface' from source: set_fact 18714 1726853429.80975: variable 'interface' from source: set_fact 18714 1726853429.81009: variable '__network_packages_default_team' from source: role '' defaults 18714 1726853429.81127: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853429.81613: variable 'network_connections' from source: play vars 18714 1726853429.81618: variable 'profile' from source: play vars 18714 1726853429.81697: variable 'profile' from source: play vars 18714 1726853429.81705: variable 'interface' from source: set_fact 18714 1726853429.81922: variable 'interface' from source: set_fact 18714 1726853429.82087: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853429.82149: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853429.82158: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853429.82308: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853429.82616: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18714 1726853429.83145: variable 'network_connections' from source: play vars 18714 1726853429.83148: variable 'profile' from source: play vars 18714 1726853429.83408: variable 'profile' from source: play vars 18714 1726853429.83411: variable 'interface' from source: set_fact 18714 1726853429.83413: variable 'interface' from source: set_fact 18714 1726853429.83415: variable 'ansible_distribution' from source: facts 18714 1726853429.83417: variable '__network_rh_distros' from source: role '' defaults 18714 1726853429.83419: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.83421: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18714 1726853429.83444: variable 'ansible_distribution' from source: facts 18714 1726853429.83447: variable '__network_rh_distros' from source: role '' defaults 18714 1726853429.83452: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.83474: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18714 1726853429.83624: variable 'ansible_distribution' from source: facts 18714 1726853429.83627: variable '__network_rh_distros' from source: role '' defaults 18714 1726853429.83630: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.83664: variable 'network_provider' from source: set_fact 18714 1726853429.83680: variable 'ansible_facts' from source: unknown 18714 1726853429.84409: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18714 
1726853429.84413: when evaluation is False, skipping this task 18714 1726853429.84415: _execute() done 18714 1726853429.84418: dumping result to json 18714 1726853429.84420: done dumping result, returning 18714 1726853429.84430: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-e784-4f7d-000000000044] 18714 1726853429.84432: sending task result for task 02083763-bbaf-e784-4f7d-000000000044 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18714 1726853429.84642: no more pending results, returning what we have 18714 1726853429.84646: results queue empty 18714 1726853429.84647: checking for any_errors_fatal 18714 1726853429.84654: done checking for any_errors_fatal 18714 1726853429.84655: checking for max_fail_percentage 18714 1726853429.84657: done checking for max_fail_percentage 18714 1726853429.84658: checking to see if all hosts have failed and the running result is not ok 18714 1726853429.84659: done checking to see if all hosts have failed 18714 1726853429.84659: getting the remaining hosts for this loop 18714 1726853429.84661: done getting the remaining hosts for this loop 18714 1726853429.84665: getting the next task for host managed_node1 18714 1726853429.84674: done getting next task for host managed_node1 18714 1726853429.84679: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18714 1726853429.84681: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853429.84696: getting variables 18714 1726853429.84697: in VariableManager get_vars() 18714 1726853429.84855: Calling all_inventory to load vars for managed_node1 18714 1726853429.84859: Calling groups_inventory to load vars for managed_node1 18714 1726853429.84862: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853429.84876: Calling all_plugins_play to load vars for managed_node1 18714 1726853429.84884: done sending task result for task 02083763-bbaf-e784-4f7d-000000000044 18714 1726853429.84897: WORKER PROCESS EXITING 18714 1726853429.84892: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853429.84902: Calling groups_plugins_play to load vars for managed_node1 18714 1726853429.86861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853429.88772: done with get_vars() 18714 1726853429.88811: done getting variables 18714 1726853429.88876: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:30:29 -0400 (0:00:00.226) 0:00:26.272 ****** 18714 1726853429.88913: entering _queue_task() for managed_node1/package 18714 1726853429.89260: worker is 1 (out of 1 available) 18714 1726853429.89278: exiting _queue_task() for managed_node1/package 18714 1726853429.89292: done queuing things up, now waiting for results queue to drain 18714 1726853429.89293: waiting for pending results... 
18714 1726853429.89598: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18714 1726853429.89725: in run() - task 02083763-bbaf-e784-4f7d-000000000045 18714 1726853429.89738: variable 'ansible_search_path' from source: unknown 18714 1726853429.89741: variable 'ansible_search_path' from source: unknown 18714 1726853429.89807: calling self._execute() 18714 1726853429.90119: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853429.90131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853429.90148: variable 'omit' from source: magic vars 18714 1726853429.90586: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.90604: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853429.90743: variable 'network_state' from source: role '' defaults 18714 1726853429.90776: Evaluated conditional (network_state != {}): False 18714 1726853429.90784: when evaluation is False, skipping this task 18714 1726853429.90793: _execute() done 18714 1726853429.90801: dumping result to json 18714 1726853429.90808: done dumping result, returning 18714 1726853429.90819: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-e784-4f7d-000000000045] 18714 1726853429.90874: sending task result for task 02083763-bbaf-e784-4f7d-000000000045 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853429.91006: no more pending results, returning what we have 18714 1726853429.91011: results queue empty 18714 1726853429.91012: checking for any_errors_fatal 18714 1726853429.91018: done checking for any_errors_fatal 18714 1726853429.91019: checking for max_fail_percentage 18714 
1726853429.91021: done checking for max_fail_percentage 18714 1726853429.91022: checking to see if all hosts have failed and the running result is not ok 18714 1726853429.91023: done checking to see if all hosts have failed 18714 1726853429.91023: getting the remaining hosts for this loop 18714 1726853429.91025: done getting the remaining hosts for this loop 18714 1726853429.91032: getting the next task for host managed_node1 18714 1726853429.91040: done getting next task for host managed_node1 18714 1726853429.91044: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18714 1726853429.91047: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853429.91065: getting variables 18714 1726853429.91068: in VariableManager get_vars() 18714 1726853429.91114: Calling all_inventory to load vars for managed_node1 18714 1726853429.91117: Calling groups_inventory to load vars for managed_node1 18714 1726853429.91120: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853429.91132: Calling all_plugins_play to load vars for managed_node1 18714 1726853429.91136: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853429.91139: Calling groups_plugins_play to load vars for managed_node1 18714 1726853429.91790: done sending task result for task 02083763-bbaf-e784-4f7d-000000000045 18714 1726853429.91794: WORKER PROCESS EXITING 18714 1726853429.93404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853429.95200: done with get_vars() 18714 1726853429.95227: done getting variables 18714 1726853429.95494: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:30:29 -0400 (0:00:00.066) 0:00:26.338 ****** 18714 1726853429.95525: entering _queue_task() for managed_node1/package 18714 1726853429.96066: worker is 1 (out of 1 available) 18714 1726853429.96089: exiting _queue_task() for managed_node1/package 18714 1726853429.96101: done queuing things up, now waiting for results queue to drain 18714 1726853429.96102: waiting for pending results... 18714 1726853429.96432: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18714 1726853429.96557: in run() - task 02083763-bbaf-e784-4f7d-000000000046 18714 1726853429.96583: variable 'ansible_search_path' from source: unknown 18714 1726853429.96596: variable 'ansible_search_path' from source: unknown 18714 1726853429.96648: calling self._execute() 18714 1726853429.96762: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853429.96778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853429.96796: variable 'omit' from source: magic vars 18714 1726853429.97248: variable 'ansible_distribution_major_version' from source: facts 18714 1726853429.97251: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853429.97374: variable 'network_state' from source: role '' defaults 18714 1726853429.97398: Evaluated conditional (network_state != {}): False 18714 1726853429.97404: when evaluation is False, 
skipping this task 18714 1726853429.97496: _execute() done 18714 1726853429.97499: dumping result to json 18714 1726853429.97502: done dumping result, returning 18714 1726853429.97505: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-e784-4f7d-000000000046] 18714 1726853429.97507: sending task result for task 02083763-bbaf-e784-4f7d-000000000046 18714 1726853429.97579: done sending task result for task 02083763-bbaf-e784-4f7d-000000000046 18714 1726853429.97582: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853429.97637: no more pending results, returning what we have 18714 1726853429.97642: results queue empty 18714 1726853429.97644: checking for any_errors_fatal 18714 1726853429.97651: done checking for any_errors_fatal 18714 1726853429.97652: checking for max_fail_percentage 18714 1726853429.97654: done checking for max_fail_percentage 18714 1726853429.97655: checking to see if all hosts have failed and the running result is not ok 18714 1726853429.97656: done checking to see if all hosts have failed 18714 1726853429.97657: getting the remaining hosts for this loop 18714 1726853429.97658: done getting the remaining hosts for this loop 18714 1726853429.97662: getting the next task for host managed_node1 18714 1726853429.97673: done getting next task for host managed_node1 18714 1726853429.97678: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18714 1726853429.97680: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853429.97781: getting variables 18714 1726853429.97783: in VariableManager get_vars() 18714 1726853429.97829: Calling all_inventory to load vars for managed_node1 18714 1726853429.97832: Calling groups_inventory to load vars for managed_node1 18714 1726853429.97834: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853429.97846: Calling all_plugins_play to load vars for managed_node1 18714 1726853429.97848: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853429.97851: Calling groups_plugins_play to load vars for managed_node1 18714 1726853429.99450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853430.06787: done with get_vars() 18714 1726853430.06817: done getting variables 18714 1726853430.06864: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:30:30 -0400 (0:00:00.113) 0:00:26.452 ****** 18714 1726853430.06894: entering _queue_task() for managed_node1/service 18714 1726853430.07317: worker is 1 (out of 1 available) 18714 1726853430.07331: exiting _queue_task() for managed_node1/service 18714 1726853430.07342: done queuing things up, now waiting for results queue to drain 18714 1726853430.07343: waiting for pending results... 
18714 1726853430.07749: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18714 1726853430.07755: in run() - task 02083763-bbaf-e784-4f7d-000000000047 18714 1726853430.07758: variable 'ansible_search_path' from source: unknown 18714 1726853430.07761: variable 'ansible_search_path' from source: unknown 18714 1726853430.07800: calling self._execute() 18714 1726853430.07911: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853430.07955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853430.07959: variable 'omit' from source: magic vars 18714 1726853430.08349: variable 'ansible_distribution_major_version' from source: facts 18714 1726853430.08366: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853430.08521: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853430.08765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853430.11102: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853430.11159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853430.11189: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853430.11215: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853430.11236: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853430.11300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18714 1726853430.11320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.11340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.11373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.11384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.11417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853430.11436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.11457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.11485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.11496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.11525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853430.11542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.11564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.11592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.11602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.11722: variable 'network_connections' from source: play vars 18714 1726853430.11732: variable 'profile' from source: play vars 18714 1726853430.11791: variable 'profile' from source: play vars 18714 1726853430.11794: variable 'interface' from source: set_fact 18714 1726853430.11839: variable 'interface' from source: set_fact 18714 1726853430.11894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853430.12016: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853430.12042: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853430.12069: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853430.12098: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853430.12157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853430.12161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853430.12229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.12232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853430.12377: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853430.12506: variable 'network_connections' from source: play vars 18714 1726853430.12514: variable 'profile' from source: play vars 18714 1726853430.12576: variable 'profile' from source: play vars 18714 1726853430.12580: variable 'interface' from source: set_fact 18714 1726853430.12637: variable 'interface' from source: set_fact 18714 1726853430.12664: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853430.12667: when evaluation is False, skipping this task 18714 1726853430.12670: _execute() done 18714 1726853430.12675: dumping result to json 18714 1726853430.12977: done dumping result, returning 18714 1726853430.12980: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000047] 18714 1726853430.12991: sending task result for task 02083763-bbaf-e784-4f7d-000000000047 18714 1726853430.13202: done sending task result for task 02083763-bbaf-e784-4f7d-000000000047 18714 1726853430.13205: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18714 1726853430.13311: no more pending results, returning what we have 18714 1726853430.13314: results queue empty 18714 1726853430.13315: checking for any_errors_fatal 18714 1726853430.13320: done checking for any_errors_fatal 18714 1726853430.13320: checking for max_fail_percentage 18714 1726853430.13322: done checking for max_fail_percentage 18714 1726853430.13323: checking to see if all hosts have failed and the running result is not ok 18714 1726853430.13323: done checking to see if all hosts have failed 18714 1726853430.13324: getting the remaining hosts for this loop 18714 1726853430.13325: done getting the remaining hosts for this loop 18714 1726853430.13328: getting the next task for host managed_node1 18714 1726853430.13333: done getting next task for host managed_node1 18714 1726853430.13336: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18714 1726853430.13338: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853430.13350: getting variables 18714 1726853430.13351: in VariableManager get_vars() 18714 1726853430.13482: Calling all_inventory to load vars for managed_node1 18714 1726853430.13485: Calling groups_inventory to load vars for managed_node1 18714 1726853430.13487: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853430.13502: Calling all_plugins_play to load vars for managed_node1 18714 1726853430.13505: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853430.13508: Calling groups_plugins_play to load vars for managed_node1 18714 1726853430.15155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853430.17478: done with get_vars() 18714 1726853430.17510: done getting variables 18714 1726853430.17570: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:30:30 -0400 (0:00:00.107) 0:00:26.559 ****** 18714 1726853430.17605: entering _queue_task() for managed_node1/service 18714 1726853430.18525: worker is 1 (out of 1 available) 18714 1726853430.18538: exiting _queue_task() for managed_node1/service 18714 1726853430.18548: done queuing things up, now waiting for results queue to drain 18714 1726853430.18549: waiting for pending results... 
18714 1726853430.18996: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18714 1726853430.19332: in run() - task 02083763-bbaf-e784-4f7d-000000000048 18714 1726853430.19541: variable 'ansible_search_path' from source: unknown 18714 1726853430.19545: variable 'ansible_search_path' from source: unknown 18714 1726853430.19547: calling self._execute() 18714 1726853430.19710: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853430.19722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853430.19735: variable 'omit' from source: magic vars 18714 1726853430.20677: variable 'ansible_distribution_major_version' from source: facts 18714 1726853430.20681: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853430.21127: variable 'network_provider' from source: set_fact 18714 1726853430.21137: variable 'network_state' from source: role '' defaults 18714 1726853430.21156: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18714 1726853430.21167: variable 'omit' from source: magic vars 18714 1726853430.21396: variable 'omit' from source: magic vars 18714 1726853430.21399: variable 'network_service_name' from source: role '' defaults 18714 1726853430.21455: variable 'network_service_name' from source: role '' defaults 18714 1726853430.21721: variable '__network_provider_setup' from source: role '' defaults 18714 1726853430.21733: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853430.21849: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853430.21905: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853430.22062: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853430.22597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18714 1726853430.27421: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853430.27486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853430.27635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853430.27715: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853430.27754: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853430.27898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853430.27996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.28139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.28185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.28267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.28324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18714 1726853430.28420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.28502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.28610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.28614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.29190: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18714 1726853430.29476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853430.29602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.29633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.29682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.29707: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.29875: variable 'ansible_python' from source: facts 18714 1726853430.29946: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18714 1726853430.30120: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853430.30376: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853430.30620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853430.30649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.30895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.30898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.30901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.30998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853430.31041: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853430.31075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.31166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853430.31453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853430.31511: variable 'network_connections' from source: play vars 18714 1726853430.31572: variable 'profile' from source: play vars 18714 1726853430.31646: variable 'profile' from source: play vars 18714 1726853430.31788: variable 'interface' from source: set_fact 18714 1726853430.31857: variable 'interface' from source: set_fact 18714 1726853430.32086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853430.33242: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853430.33407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853430.33463: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853430.33614: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853430.33736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853430.33846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853430.33894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853430.34077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853430.34113: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853430.34723: variable 'network_connections' from source: play vars 18714 1726853430.34824: variable 'profile' from source: play vars 18714 1726853430.34906: variable 'profile' from source: play vars 18714 1726853430.35141: variable 'interface' from source: set_fact 18714 1726853430.35144: variable 'interface' from source: set_fact 18714 1726853430.35147: variable '__network_packages_default_wireless' from source: role '' defaults 18714 1726853430.35327: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853430.35935: variable 'network_connections' from source: play vars 18714 1726853430.35945: variable 'profile' from source: play vars 18714 1726853430.36085: variable 'profile' from source: play vars 18714 1726853430.36128: variable 'interface' from source: set_fact 18714 1726853430.36337: variable 'interface' from source: set_fact 18714 1726853430.36374: variable '__network_packages_default_team' from source: role '' defaults 18714 1726853430.36576: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853430.37129: variable 
'network_connections' from source: play vars 18714 1726853430.37184: variable 'profile' from source: play vars 18714 1726853430.37336: variable 'profile' from source: play vars 18714 1726853430.37347: variable 'interface' from source: set_fact 18714 1726853430.37473: variable 'interface' from source: set_fact 18714 1726853430.37661: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853430.37777: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853430.37781: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853430.37904: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853430.38476: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18714 1726853430.39503: variable 'network_connections' from source: play vars 18714 1726853430.39506: variable 'profile' from source: play vars 18714 1726853430.39568: variable 'profile' from source: play vars 18714 1726853430.39603: variable 'interface' from source: set_fact 18714 1726853430.39813: variable 'interface' from source: set_fact 18714 1726853430.39816: variable 'ansible_distribution' from source: facts 18714 1726853430.39818: variable '__network_rh_distros' from source: role '' defaults 18714 1726853430.39820: variable 'ansible_distribution_major_version' from source: facts 18714 1726853430.39844: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18714 1726853430.40233: variable 'ansible_distribution' from source: facts 18714 1726853430.40465: variable '__network_rh_distros' from source: role '' defaults 18714 1726853430.40468: variable 'ansible_distribution_major_version' from source: facts 18714 1726853430.40472: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18714 1726853430.40812: variable 'ansible_distribution' from source: 
facts 18714 1726853430.40816: variable '__network_rh_distros' from source: role '' defaults 18714 1726853430.40818: variable 'ansible_distribution_major_version' from source: facts 18714 1726853430.40855: variable 'network_provider' from source: set_fact 18714 1726853430.40887: variable 'omit' from source: magic vars 18714 1726853430.41005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853430.41019: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853430.41223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853430.41226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853430.41229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853430.41231: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853430.41423: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853430.41426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853430.41986: Set connection var ansible_shell_executable to /bin/sh 18714 1726853430.41989: Set connection var ansible_timeout to 10 18714 1726853430.41992: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853430.41994: Set connection var ansible_connection to ssh 18714 1726853430.42032: Set connection var ansible_shell_type to sh 18714 1726853430.42277: Set connection var ansible_pipelining to False 18714 1726853430.42282: variable 'ansible_shell_executable' from source: unknown 18714 1726853430.42285: variable 'ansible_connection' from source: unknown 18714 1726853430.42288: variable 'ansible_module_compression' from source: unknown 18714 1726853430.42290: 
variable 'ansible_shell_type' from source: unknown 18714 1726853430.42292: variable 'ansible_shell_executable' from source: unknown 18714 1726853430.42295: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853430.42302: variable 'ansible_pipelining' from source: unknown 18714 1726853430.42305: variable 'ansible_timeout' from source: unknown 18714 1726853430.42307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853430.42642: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853430.42650: variable 'omit' from source: magic vars 18714 1726853430.42859: starting attempt loop 18714 1726853430.42863: running the handler 18714 1726853430.42865: variable 'ansible_facts' from source: unknown 18714 1726853430.45206: _low_level_execute_command(): starting 18714 1726853430.45219: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853430.46998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853430.47002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853430.47188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853430.47239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853430.48980: stdout chunk (state=3): >>>/root <<< 18714 1726853430.49187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853430.49366: stderr chunk (state=3): >>><<< 18714 1726853430.49369: stdout chunk (state=3): >>><<< 18714 1726853430.49387: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 18714 1726853430.49406: _low_level_execute_command(): starting 18714 1726853430.49416: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208 `" && echo ansible-tmp-1726853430.4939418-19940-25369775112208="` echo /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208 `" ) && sleep 0' 18714 1726853430.50500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853430.50513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853430.50584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853430.50830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853430.50863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853430.50991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853430.52890: stdout chunk 
(state=3): >>>ansible-tmp-1726853430.4939418-19940-25369775112208=/root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208 <<< 18714 1726853430.53036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853430.53054: stdout chunk (state=3): >>><<< 18714 1726853430.53070: stderr chunk (state=3): >>><<< 18714 1726853430.53090: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853430.4939418-19940-25369775112208=/root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853430.53123: variable 'ansible_module_compression' from source: unknown 18714 1726853430.53194: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18714 1726853430.53269: variable 
'ansible_facts' from source: unknown 18714 1726853430.53528: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/AnsiballZ_systemd.py 18714 1726853430.53655: Sending initial data 18714 1726853430.53658: Sent initial data (155 bytes) 18714 1726853430.54283: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853430.54336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853430.54366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853430.55909: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853430.55945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853430.56000: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpl5ww6cfn /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/AnsiballZ_systemd.py <<< 18714 1726853430.56027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/AnsiballZ_systemd.py" <<< 18714 1726853430.56031: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18714 1726853430.56067: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpl5ww6cfn" to remote "/root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/AnsiballZ_systemd.py" <<< 18714 1726853430.57953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853430.57963: stderr chunk (state=3): >>><<< 18714 1726853430.57965: stdout chunk (state=3): >>><<< 18714 1726853430.57975: done transferring module to remote 18714 1726853430.57990: _low_level_execute_command(): starting 18714 1726853430.57993: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/ /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/AnsiballZ_systemd.py && sleep 0' 18714 1726853430.58411: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853430.58416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853430.58443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853430.58446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853430.58449: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853430.58458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853430.58510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853430.58516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853430.58557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853430.60340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853430.60343: stdout chunk (state=3): >>><<< 18714 1726853430.60346: stderr chunk (state=3): >>><<< 18714 1726853430.60423: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853430.60426: _low_level_execute_command(): starting 18714 1726853430.60429: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/AnsiballZ_systemd.py && sleep 0' 18714 1726853430.60977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853430.60981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853430.61057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853430.61119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853430.61123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853430.61166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853430.90245: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10653696", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304824832", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "877145000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 18714 1726853430.90287: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", 
"IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-i<<< 18714 1726853430.90309: stdout chunk (state=3): >>>nit-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", 
"InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18714 1726853430.92128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853430.92155: stderr chunk (state=3): >>><<< 18714 1726853430.92159: stdout chunk (state=3): >>><<< 18714 1726853430.92175: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10653696", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304824832", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "877145000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853430.92308: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853430.92322: _low_level_execute_command(): starting 18714 1726853430.92327: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853430.4939418-19940-25369775112208/ > /dev/null 2>&1 && sleep 0' 18714 1726853430.92954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853430.92958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853430.92990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853430.94777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853430.94781: stderr chunk (state=3): >>><<< 18714 1726853430.94784: stdout chunk (state=3): >>><<< 18714 1726853430.94989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853430.94993: handler run complete 18714 1726853430.94996: attempt loop complete, returning result 18714 1726853430.94998: _execute() done 18714 
1726853430.95000: dumping result to json 18714 1726853430.95002: done dumping result, returning 18714 1726853430.95004: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-e784-4f7d-000000000048] 18714 1726853430.95008: sending task result for task 02083763-bbaf-e784-4f7d-000000000048 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853430.95560: no more pending results, returning what we have 18714 1726853430.95565: results queue empty 18714 1726853430.95567: checking for any_errors_fatal 18714 1726853430.95622: done checking for any_errors_fatal 18714 1726853430.95623: checking for max_fail_percentage 18714 1726853430.95626: done checking for max_fail_percentage 18714 1726853430.95627: checking to see if all hosts have failed and the running result is not ok 18714 1726853430.95628: done checking to see if all hosts have failed 18714 1726853430.95628: getting the remaining hosts for this loop 18714 1726853430.95630: done getting the remaining hosts for this loop 18714 1726853430.95634: getting the next task for host managed_node1 18714 1726853430.95737: done getting next task for host managed_node1 18714 1726853430.95741: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18714 1726853430.95743: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853430.95753: getting variables 18714 1726853430.95755: in VariableManager get_vars() 18714 1726853430.95968: Calling all_inventory to load vars for managed_node1 18714 1726853430.95973: Calling groups_inventory to load vars for managed_node1 18714 1726853430.96009: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853430.96023: Calling all_plugins_play to load vars for managed_node1 18714 1726853430.96026: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853430.96031: Calling groups_plugins_play to load vars for managed_node1 18714 1726853430.96550: done sending task result for task 02083763-bbaf-e784-4f7d-000000000048 18714 1726853430.96556: WORKER PROCESS EXITING 18714 1726853430.97026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853430.97911: done with get_vars() 18714 1726853430.97929: done getting variables 18714 1726853430.97974: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:30:30 -0400 (0:00:00.803) 0:00:27.363 ****** 18714 1726853430.97997: entering _queue_task() for managed_node1/service 18714 1726853430.98232: worker is 1 (out of 1 available) 18714 1726853430.98245: exiting _queue_task() for managed_node1/service 18714 1726853430.98259: done queuing things up, now waiting for results queue to drain 18714 1726853430.98260: waiting for pending results... 
18714 1726853430.98475: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18714 1726853430.98559: in run() - task 02083763-bbaf-e784-4f7d-000000000049 18714 1726853430.98584: variable 'ansible_search_path' from source: unknown 18714 1726853430.98590: variable 'ansible_search_path' from source: unknown 18714 1726853430.98631: calling self._execute() 18714 1726853430.98712: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853430.98716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853430.98733: variable 'omit' from source: magic vars 18714 1726853430.99258: variable 'ansible_distribution_major_version' from source: facts 18714 1726853430.99264: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853430.99521: variable 'network_provider' from source: set_fact 18714 1726853430.99525: Evaluated conditional (network_provider == "nm"): True 18714 1726853430.99625: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853430.99760: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853430.99988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853431.01786: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853431.01836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853431.01874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853431.01898: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853431.01927: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853431.02036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853431.02059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853431.02099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853431.02124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853431.02135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853431.02200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853431.02237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853431.02258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853431.02314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853431.02351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853431.02396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853431.02421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853431.02442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853431.02502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853431.02525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853431.02640: variable 'network_connections' from source: play vars 18714 1726853431.02667: variable 'profile' from source: play vars 18714 1726853431.02729: variable 'profile' from source: play vars 18714 1726853431.02732: variable 'interface' from source: set_fact 18714 1726853431.02800: variable 'interface' from source: set_fact 18714 1726853431.02865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853431.03047: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853431.03077: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853431.03102: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853431.03136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853431.03181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853431.03202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853431.03220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853431.03238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853431.03278: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853431.03509: variable 'network_connections' from source: play vars 18714 1726853431.03513: variable 'profile' from source: play vars 18714 1726853431.03574: variable 'profile' from source: play vars 18714 1726853431.03577: variable 'interface' from source: set_fact 18714 1726853431.03619: variable 'interface' from source: set_fact 18714 1726853431.03644: Evaluated conditional (__network_wpa_supplicant_required): False 18714 1726853431.03647: when evaluation is False, skipping this task 18714 1726853431.03649: _execute() done 18714 1726853431.03659: dumping result 
to json 18714 1726853431.03662: done dumping result, returning 18714 1726853431.03666: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-e784-4f7d-000000000049] 18714 1726853431.03669: sending task result for task 02083763-bbaf-e784-4f7d-000000000049 18714 1726853431.03755: done sending task result for task 02083763-bbaf-e784-4f7d-000000000049 18714 1726853431.03758: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18714 1726853431.03809: no more pending results, returning what we have 18714 1726853431.03812: results queue empty 18714 1726853431.03813: checking for any_errors_fatal 18714 1726853431.03835: done checking for any_errors_fatal 18714 1726853431.03836: checking for max_fail_percentage 18714 1726853431.03838: done checking for max_fail_percentage 18714 1726853431.03839: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.03840: done checking to see if all hosts have failed 18714 1726853431.03841: getting the remaining hosts for this loop 18714 1726853431.03842: done getting the remaining hosts for this loop 18714 1726853431.03846: getting the next task for host managed_node1 18714 1726853431.03852: done getting next task for host managed_node1 18714 1726853431.03857: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18714 1726853431.03858: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853431.03875: getting variables 18714 1726853431.03877: in VariableManager get_vars() 18714 1726853431.03917: Calling all_inventory to load vars for managed_node1 18714 1726853431.03919: Calling groups_inventory to load vars for managed_node1 18714 1726853431.03922: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.03933: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.03936: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.03939: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.05666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.07769: done with get_vars() 18714 1726853431.07807: done getting variables 18714 1726853431.07893: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:30:31 -0400 (0:00:00.099) 0:00:27.462 ****** 18714 1726853431.07941: entering _queue_task() for managed_node1/service 18714 1726853431.08601: worker is 1 (out of 1 available) 18714 1726853431.08610: exiting _queue_task() for managed_node1/service 18714 1726853431.08621: done queuing things up, now waiting for results queue to drain 18714 1726853431.08622: waiting for pending results... 
18714 1726853431.08820: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18714 1726853431.08944: in run() - task 02083763-bbaf-e784-4f7d-00000000004a 18714 1726853431.08948: variable 'ansible_search_path' from source: unknown 18714 1726853431.08953: variable 'ansible_search_path' from source: unknown 18714 1726853431.09063: calling self._execute() 18714 1726853431.09193: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.09207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.09211: variable 'omit' from source: magic vars 18714 1726853431.09641: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.09649: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.09760: variable 'network_provider' from source: set_fact 18714 1726853431.09766: Evaluated conditional (network_provider == "initscripts"): False 18714 1726853431.09769: when evaluation is False, skipping this task 18714 1726853431.09773: _execute() done 18714 1726853431.09776: dumping result to json 18714 1726853431.09779: done dumping result, returning 18714 1726853431.09857: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-e784-4f7d-00000000004a] 18714 1726853431.09861: sending task result for task 02083763-bbaf-e784-4f7d-00000000004a 18714 1726853431.09922: done sending task result for task 02083763-bbaf-e784-4f7d-00000000004a 18714 1726853431.09925: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853431.09987: no more pending results, returning what we have 18714 1726853431.09991: results queue empty 18714 1726853431.09992: checking for any_errors_fatal 18714 1726853431.09999: done checking for 
any_errors_fatal 18714 1726853431.10000: checking for max_fail_percentage 18714 1726853431.10002: done checking for max_fail_percentage 18714 1726853431.10003: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.10004: done checking to see if all hosts have failed 18714 1726853431.10004: getting the remaining hosts for this loop 18714 1726853431.10006: done getting the remaining hosts for this loop 18714 1726853431.10010: getting the next task for host managed_node1 18714 1726853431.10017: done getting next task for host managed_node1 18714 1726853431.10020: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18714 1726853431.10023: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853431.10038: getting variables 18714 1726853431.10040: in VariableManager get_vars() 18714 1726853431.10083: Calling all_inventory to load vars for managed_node1 18714 1726853431.10086: Calling groups_inventory to load vars for managed_node1 18714 1726853431.10088: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.10100: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.10103: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.10106: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.11728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.13342: done with get_vars() 18714 1726853431.13366: done getting variables 18714 1726853431.13424: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:30:31 -0400 (0:00:00.055) 0:00:27.517 ****** 18714 1726853431.13457: entering _queue_task() for managed_node1/copy 18714 1726853431.13785: worker is 1 (out of 1 available) 18714 1726853431.13797: exiting _queue_task() for managed_node1/copy 18714 1726853431.13808: done queuing things up, now waiting for results queue to drain 18714 1726853431.13809: waiting for pending results... 
18714 1726853431.14196: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18714 1726853431.14202: in run() - task 02083763-bbaf-e784-4f7d-00000000004b 18714 1726853431.14278: variable 'ansible_search_path' from source: unknown 18714 1726853431.14284: variable 'ansible_search_path' from source: unknown 18714 1726853431.14288: calling self._execute() 18714 1726853431.14378: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.14382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.14386: variable 'omit' from source: magic vars 18714 1726853431.14841: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.14845: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.14889: variable 'network_provider' from source: set_fact 18714 1726853431.14896: Evaluated conditional (network_provider == "initscripts"): False 18714 1726853431.14899: when evaluation is False, skipping this task 18714 1726853431.14902: _execute() done 18714 1726853431.14904: dumping result to json 18714 1726853431.14910: done dumping result, returning 18714 1726853431.14919: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-e784-4f7d-00000000004b] 18714 1726853431.14923: sending task result for task 02083763-bbaf-e784-4f7d-00000000004b 18714 1726853431.15023: done sending task result for task 02083763-bbaf-e784-4f7d-00000000004b 18714 1726853431.15026: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18714 1726853431.15103: no more pending results, returning what we have 18714 1726853431.15107: results queue empty 18714 1726853431.15109: checking for 
any_errors_fatal 18714 1726853431.15115: done checking for any_errors_fatal 18714 1726853431.15117: checking for max_fail_percentage 18714 1726853431.15119: done checking for max_fail_percentage 18714 1726853431.15120: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.15121: done checking to see if all hosts have failed 18714 1726853431.15121: getting the remaining hosts for this loop 18714 1726853431.15123: done getting the remaining hosts for this loop 18714 1726853431.15127: getting the next task for host managed_node1 18714 1726853431.15134: done getting next task for host managed_node1 18714 1726853431.15138: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18714 1726853431.15140: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853431.15158: getting variables 18714 1726853431.15160: in VariableManager get_vars() 18714 1726853431.15200: Calling all_inventory to load vars for managed_node1 18714 1726853431.15203: Calling groups_inventory to load vars for managed_node1 18714 1726853431.15206: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.15218: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.15221: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.15224: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.16746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.18503: done with get_vars() 18714 1726853431.18525: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:30:31 -0400 (0:00:00.051) 0:00:27.569 ****** 18714 1726853431.18609: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18714 1726853431.18913: worker is 1 (out of 1 available) 18714 1726853431.18926: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18714 1726853431.18938: done queuing things up, now waiting for results queue to drain 18714 1726853431.18939: waiting for pending results... 
18714 1726853431.19383: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18714 1726853431.19400: in run() - task 02083763-bbaf-e784-4f7d-00000000004c 18714 1726853431.19405: variable 'ansible_search_path' from source: unknown 18714 1726853431.19408: variable 'ansible_search_path' from source: unknown 18714 1726853431.19411: calling self._execute() 18714 1726853431.19480: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.19485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.19496: variable 'omit' from source: magic vars 18714 1726853431.19869: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.19882: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.19887: variable 'omit' from source: magic vars 18714 1726853431.19930: variable 'omit' from source: magic vars 18714 1726853431.20099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853431.22238: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853431.22243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853431.22274: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853431.22307: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853431.22332: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853431.22414: variable 'network_provider' from source: set_fact 18714 1726853431.22546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853431.22594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853431.22619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853431.22674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853431.22677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853431.22782: variable 'omit' from source: magic vars 18714 1726853431.22872: variable 'omit' from source: magic vars 18714 1726853431.22982: variable 'network_connections' from source: play vars 18714 1726853431.22999: variable 'profile' from source: play vars 18714 1726853431.23068: variable 'profile' from source: play vars 18714 1726853431.23075: variable 'interface' from source: set_fact 18714 1726853431.23139: variable 'interface' from source: set_fact 18714 1726853431.23285: variable 'omit' from source: magic vars 18714 1726853431.23294: variable '__lsr_ansible_managed' from source: task vars 18714 1726853431.23353: variable '__lsr_ansible_managed' from source: task vars 18714 1726853431.23652: Loaded config def from plugin (lookup/template) 18714 1726853431.23655: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18714 1726853431.23691: File lookup term: get_ansible_managed.j2 18714 
1726853431.23695: variable 'ansible_search_path' from source: unknown 18714 1726853431.23698: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18714 1726853431.23710: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18714 1726853431.23761: variable 'ansible_search_path' from source: unknown 18714 1726853431.29545: variable 'ansible_managed' from source: unknown 18714 1726853431.29877: variable 'omit' from source: magic vars 18714 1726853431.29881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853431.29980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853431.29984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853431.29986: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.29987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.29989: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853431.30272: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.30276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.30278: Set connection var ansible_shell_executable to /bin/sh 18714 1726853431.30281: Set connection var ansible_timeout to 10 18714 1726853431.30286: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853431.30294: Set connection var ansible_connection to ssh 18714 1726853431.30299: Set connection var ansible_shell_type to sh 18714 1726853431.30304: Set connection var ansible_pipelining to False 18714 1726853431.30325: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.30328: variable 'ansible_connection' from source: unknown 18714 1726853431.30330: variable 'ansible_module_compression' from source: unknown 18714 1726853431.30333: variable 'ansible_shell_type' from source: unknown 18714 1726853431.30335: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.30337: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.30342: variable 'ansible_pipelining' from source: unknown 18714 1726853431.30344: variable 'ansible_timeout' from source: unknown 18714 1726853431.30348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.30656: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853431.30666: variable 'omit' from source: magic vars 18714 1726853431.30674: starting attempt loop 18714 1726853431.30677: running the handler 18714 1726853431.30690: _low_level_execute_command(): starting 18714 1726853431.30702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853431.31585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853431.31588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853431.31591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853431.31594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853431.31596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853431.31598: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853431.31600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.31602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853431.31605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853431.31607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853431.31609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853431.31611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853431.31613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853431.31615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 <<< 18714 1726853431.31694: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853431.31815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853431.31852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853431.33981: stdout chunk (state=3): >>>/root <<< 18714 1726853431.33985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853431.33988: stdout chunk (state=3): >>><<< 18714 1726853431.33990: stderr chunk (state=3): >>><<< 18714 1726853431.33992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 18714 1726853431.33994: _low_level_execute_command(): starting 18714 1726853431.33996: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048 `" && echo ansible-tmp-1726853431.3389864-19983-61421410264048="` echo /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048 `" ) && sleep 0' 18714 1726853431.34744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853431.34761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853431.34785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853431.34810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853431.34883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.34940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853431.35000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853431.35291: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853431.37098: stdout chunk (state=3): >>>ansible-tmp-1726853431.3389864-19983-61421410264048=/root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048 <<< 18714 1726853431.37200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853431.37230: stderr chunk (state=3): >>><<< 18714 1726853431.37234: stdout chunk (state=3): >>><<< 18714 1726853431.37250: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853431.3389864-19983-61421410264048=/root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853431.37292: variable 'ansible_module_compression' from source: unknown 18714 1726853431.37329: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18714 1726853431.37364: variable 'ansible_facts' from source: unknown 18714 1726853431.37433: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/AnsiballZ_network_connections.py 18714 1726853431.37534: Sending initial data 18714 1726853431.37538: Sent initial data (167 bytes) 18714 1726853431.37947: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853431.37964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853431.37968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853431.37972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.37985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.38038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853431.38041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853431.38089: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18714 1726853431.39657: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853431.39708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853431.39770: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp5z4voyxf /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/AnsiballZ_network_connections.py <<< 18714 1726853431.39776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/AnsiballZ_network_connections.py" <<< 18714 1726853431.39807: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp5z4voyxf" to remote "/root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/AnsiballZ_network_connections.py" <<< 18714 1726853431.40738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853431.40781: stderr chunk (state=3): >>><<< 18714 
1726853431.40784: stdout chunk (state=3): >>><<< 18714 1726853431.40805: done transferring module to remote 18714 1726853431.40814: _low_level_execute_command(): starting 18714 1726853431.40818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/ /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/AnsiballZ_network_connections.py && sleep 0' 18714 1726853431.41244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853431.41247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853431.41249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.41251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853431.41254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.41315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853431.41319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853431.41321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853431.41351: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853431.43341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853431.43345: stdout chunk (state=3): >>><<< 18714 1726853431.43347: stderr chunk (state=3): >>><<< 18714 1726853431.43364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853431.43442: _low_level_execute_command(): starting 18714 1726853431.43445: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/AnsiballZ_network_connections.py && sleep 0' 18714 1726853431.44027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853431.44041: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 18714 1726853431.44056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853431.44074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853431.44091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853431.44105: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853431.44188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.44236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853431.44254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853431.44277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853431.44363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853431.74804: stdout chunk (state=3): >>> <<< 18714 1726853431.74810: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": 
"nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18714 1726853431.76852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853431.76898: stderr chunk (state=3): >>><<< 18714 1726853431.76901: stdout chunk (state=3): >>><<< 18714 1726853431.76919: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853431.76945: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853431.76976: _low_level_execute_command(): starting 18714 1726853431.76979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853431.3389864-19983-61421410264048/ > /dev/null 2>&1 && sleep 0' 18714 1726853431.77498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853431.77501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853431.77503: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.77506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853431.77508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853431.77564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853431.77568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853431.77608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853431.79478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853431.79525: stderr chunk (state=3): >>><<< 18714 1726853431.79528: stdout chunk (state=3): >>><<< 18714 1726853431.79541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853431.79549: handler run complete 18714 1726853431.79570: attempt loop complete, returning result 18714 1726853431.79575: _execute() done 18714 1726853431.79577: dumping result to json 18714 1726853431.79583: done dumping result, returning 18714 1726853431.79593: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-e784-4f7d-00000000004c] 18714 1726853431.79596: sending task result for task 02083763-bbaf-e784-4f7d-00000000004c 18714 1726853431.79690: done sending task result for task 02083763-bbaf-e784-4f7d-00000000004c 18714 1726853431.79695: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18714 1726853431.79792: no more pending results, returning what we have 18714 1726853431.79795: results queue empty 18714 1726853431.79796: checking for any_errors_fatal 18714 1726853431.79807: done checking for any_errors_fatal 18714 1726853431.79808: checking for max_fail_percentage 18714 1726853431.79810: done checking for max_fail_percentage 18714 1726853431.79811: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.79811: done checking to see if all hosts 
have failed 18714 1726853431.79812: getting the remaining hosts for this loop 18714 1726853431.79814: done getting the remaining hosts for this loop 18714 1726853431.79819: getting the next task for host managed_node1 18714 1726853431.79825: done getting next task for host managed_node1 18714 1726853431.79829: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18714 1726853431.79831: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853431.79841: getting variables 18714 1726853431.79842: in VariableManager get_vars() 18714 1726853431.79887: Calling all_inventory to load vars for managed_node1 18714 1726853431.79890: Calling groups_inventory to load vars for managed_node1 18714 1726853431.79892: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.79904: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.79906: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.79910: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.80940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.82097: done with get_vars() 18714 1726853431.82115: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:30:31 -0400 (0:00:00.635) 0:00:28.205 ****** 18714 1726853431.82176: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18714 1726853431.82442: worker is 1 (out of 1 available) 18714 1726853431.82456: exiting _queue_task() 
for managed_node1/fedora.linux_system_roles.network_state 18714 1726853431.82468: done queuing things up, now waiting for results queue to drain 18714 1726853431.82469: waiting for pending results... 18714 1726853431.82651: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18714 1726853431.82747: in run() - task 02083763-bbaf-e784-4f7d-00000000004d 18714 1726853431.82764: variable 'ansible_search_path' from source: unknown 18714 1726853431.82767: variable 'ansible_search_path' from source: unknown 18714 1726853431.82799: calling self._execute() 18714 1726853431.82875: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.82881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.82887: variable 'omit' from source: magic vars 18714 1726853431.83194: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.83203: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.83291: variable 'network_state' from source: role '' defaults 18714 1726853431.83299: Evaluated conditional (network_state != {}): False 18714 1726853431.83302: when evaluation is False, skipping this task 18714 1726853431.83305: _execute() done 18714 1726853431.83307: dumping result to json 18714 1726853431.83310: done dumping result, returning 18714 1726853431.83317: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-e784-4f7d-00000000004d] 18714 1726853431.83320: sending task result for task 02083763-bbaf-e784-4f7d-00000000004d 18714 1726853431.83406: done sending task result for task 02083763-bbaf-e784-4f7d-00000000004d 18714 1726853431.83408: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853431.83474: no more 
pending results, returning what we have 18714 1726853431.83478: results queue empty 18714 1726853431.83479: checking for any_errors_fatal 18714 1726853431.83489: done checking for any_errors_fatal 18714 1726853431.83490: checking for max_fail_percentage 18714 1726853431.83492: done checking for max_fail_percentage 18714 1726853431.83493: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.83493: done checking to see if all hosts have failed 18714 1726853431.83494: getting the remaining hosts for this loop 18714 1726853431.83495: done getting the remaining hosts for this loop 18714 1726853431.83499: getting the next task for host managed_node1 18714 1726853431.83506: done getting next task for host managed_node1 18714 1726853431.83510: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18714 1726853431.83512: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853431.83553: getting variables 18714 1726853431.83555: in VariableManager get_vars() 18714 1726853431.83609: Calling all_inventory to load vars for managed_node1 18714 1726853431.83612: Calling groups_inventory to load vars for managed_node1 18714 1726853431.83614: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.83622: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.83625: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.83631: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.84577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.85460: done with get_vars() 18714 1726853431.85480: done getting variables 18714 1726853431.85520: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:30:31 -0400 (0:00:00.033) 0:00:28.238 ****** 18714 1726853431.85542: entering _queue_task() for managed_node1/debug 18714 1726853431.85790: worker is 1 (out of 1 available) 18714 1726853431.85804: exiting _queue_task() for managed_node1/debug 18714 1726853431.85827: done queuing things up, now waiting for results queue to drain 18714 1726853431.85830: waiting for pending results... 
18714 1726853431.86023: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18714 1726853431.86100: in run() - task 02083763-bbaf-e784-4f7d-00000000004e 18714 1726853431.86111: variable 'ansible_search_path' from source: unknown 18714 1726853431.86114: variable 'ansible_search_path' from source: unknown 18714 1726853431.86152: calling self._execute() 18714 1726853431.86221: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.86226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.86236: variable 'omit' from source: magic vars 18714 1726853431.86518: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.86527: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.86532: variable 'omit' from source: magic vars 18714 1726853431.86567: variable 'omit' from source: magic vars 18714 1726853431.86597: variable 'omit' from source: magic vars 18714 1726853431.86630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853431.86659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853431.86676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853431.86692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.86705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.86726: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853431.86729: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.86731: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18714 1726853431.86804: Set connection var ansible_shell_executable to /bin/sh 18714 1726853431.86809: Set connection var ansible_timeout to 10 18714 1726853431.86812: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853431.86823: Set connection var ansible_connection to ssh 18714 1726853431.86825: Set connection var ansible_shell_type to sh 18714 1726853431.86828: Set connection var ansible_pipelining to False 18714 1726853431.86844: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.86847: variable 'ansible_connection' from source: unknown 18714 1726853431.86850: variable 'ansible_module_compression' from source: unknown 18714 1726853431.86852: variable 'ansible_shell_type' from source: unknown 18714 1726853431.86857: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.86859: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.86863: variable 'ansible_pipelining' from source: unknown 18714 1726853431.86865: variable 'ansible_timeout' from source: unknown 18714 1726853431.86869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.86974: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853431.86984: variable 'omit' from source: magic vars 18714 1726853431.86989: starting attempt loop 18714 1726853431.86992: running the handler 18714 1726853431.87092: variable '__network_connections_result' from source: set_fact 18714 1726853431.87131: handler run complete 18714 1726853431.87144: attempt loop complete, returning result 18714 1726853431.87148: _execute() done 18714 1726853431.87151: dumping result to json 18714 1726853431.87153: 
done dumping result, returning 18714 1726853431.87166: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-e784-4f7d-00000000004e] 18714 1726853431.87169: sending task result for task 02083763-bbaf-e784-4f7d-00000000004e 18714 1726853431.87250: done sending task result for task 02083763-bbaf-e784-4f7d-00000000004e 18714 1726853431.87253: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 18714 1726853431.87315: no more pending results, returning what we have 18714 1726853431.87317: results queue empty 18714 1726853431.87318: checking for any_errors_fatal 18714 1726853431.87327: done checking for any_errors_fatal 18714 1726853431.87327: checking for max_fail_percentage 18714 1726853431.87329: done checking for max_fail_percentage 18714 1726853431.87330: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.87331: done checking to see if all hosts have failed 18714 1726853431.87332: getting the remaining hosts for this loop 18714 1726853431.87333: done getting the remaining hosts for this loop 18714 1726853431.87336: getting the next task for host managed_node1 18714 1726853431.87343: done getting next task for host managed_node1 18714 1726853431.87346: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18714 1726853431.87348: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853431.87357: getting variables 18714 1726853431.87359: in VariableManager get_vars() 18714 1726853431.87394: Calling all_inventory to load vars for managed_node1 18714 1726853431.87397: Calling groups_inventory to load vars for managed_node1 18714 1726853431.87399: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.87407: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.87410: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.87412: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.88555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.89858: done with get_vars() 18714 1726853431.89884: done getting variables 18714 1726853431.89946: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:30:31 -0400 (0:00:00.044) 0:00:28.283 ****** 18714 1726853431.89975: entering _queue_task() for managed_node1/debug 18714 1726853431.90242: worker is 1 (out of 1 available) 18714 1726853431.90257: exiting _queue_task() for managed_node1/debug 18714 1726853431.90272: done queuing things up, now waiting for results queue to drain 18714 1726853431.90273: waiting for pending results... 
18714 1726853431.90517: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18714 1726853431.90587: in run() - task 02083763-bbaf-e784-4f7d-00000000004f 18714 1726853431.90598: variable 'ansible_search_path' from source: unknown 18714 1726853431.90600: variable 'ansible_search_path' from source: unknown 18714 1726853431.90631: calling self._execute() 18714 1726853431.90717: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.90721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.90730: variable 'omit' from source: magic vars 18714 1726853431.91087: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.91110: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.91114: variable 'omit' from source: magic vars 18714 1726853431.91144: variable 'omit' from source: magic vars 18714 1726853431.91172: variable 'omit' from source: magic vars 18714 1726853431.91209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853431.91238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853431.91263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853431.91305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.91309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.91346: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853431.91350: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.91355: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18714 1726853431.91419: Set connection var ansible_shell_executable to /bin/sh 18714 1726853431.91429: Set connection var ansible_timeout to 10 18714 1726853431.91432: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853431.91439: Set connection var ansible_connection to ssh 18714 1726853431.91441: Set connection var ansible_shell_type to sh 18714 1726853431.91447: Set connection var ansible_pipelining to False 18714 1726853431.91466: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.91469: variable 'ansible_connection' from source: unknown 18714 1726853431.91473: variable 'ansible_module_compression' from source: unknown 18714 1726853431.91476: variable 'ansible_shell_type' from source: unknown 18714 1726853431.91478: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.91480: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.91482: variable 'ansible_pipelining' from source: unknown 18714 1726853431.91484: variable 'ansible_timeout' from source: unknown 18714 1726853431.91492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.91602: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853431.91606: variable 'omit' from source: magic vars 18714 1726853431.91612: starting attempt loop 18714 1726853431.91615: running the handler 18714 1726853431.91659: variable '__network_connections_result' from source: set_fact 18714 1726853431.91724: variable '__network_connections_result' from source: set_fact 18714 1726853431.91797: handler run complete 18714 1726853431.91816: attempt loop complete, returning result 18714 1726853431.91819: 
_execute() done 18714 1726853431.91822: dumping result to json 18714 1726853431.91824: done dumping result, returning 18714 1726853431.91830: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-e784-4f7d-00000000004f] 18714 1726853431.91849: sending task result for task 02083763-bbaf-e784-4f7d-00000000004f 18714 1726853431.91943: done sending task result for task 02083763-bbaf-e784-4f7d-00000000004f 18714 1726853431.91946: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 18714 1726853431.92044: no more pending results, returning what we have 18714 1726853431.92047: results queue empty 18714 1726853431.92048: checking for any_errors_fatal 18714 1726853431.92055: done checking for any_errors_fatal 18714 1726853431.92056: checking for max_fail_percentage 18714 1726853431.92057: done checking for max_fail_percentage 18714 1726853431.92058: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.92058: done checking to see if all hosts have failed 18714 1726853431.92059: getting the remaining hosts for this loop 18714 1726853431.92060: done getting the remaining hosts for this loop 18714 1726853431.92063: getting the next task for host managed_node1 18714 1726853431.92069: done getting next task for host managed_node1 18714 1726853431.92074: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18714 1726853431.92075: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853431.92084: getting variables 18714 1726853431.92085: in VariableManager get_vars() 18714 1726853431.92116: Calling all_inventory to load vars for managed_node1 18714 1726853431.92118: Calling groups_inventory to load vars for managed_node1 18714 1726853431.92120: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.92128: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.92130: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.92133: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.93054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.94035: done with get_vars() 18714 1726853431.94057: done getting variables 18714 1726853431.94104: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:30:31 -0400 (0:00:00.041) 0:00:28.324 ****** 18714 1726853431.94131: entering _queue_task() for managed_node1/debug 18714 1726853431.94392: worker is 1 (out of 1 available) 18714 1726853431.94406: exiting _queue_task() for managed_node1/debug 18714 1726853431.94419: done queuing things up, now waiting for results queue to drain 18714 1726853431.94421: waiting for pending results... 
18714 1726853431.94648: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18714 1726853431.94740: in run() - task 02083763-bbaf-e784-4f7d-000000000050 18714 1726853431.94756: variable 'ansible_search_path' from source: unknown 18714 1726853431.94760: variable 'ansible_search_path' from source: unknown 18714 1726853431.94791: calling self._execute() 18714 1726853431.94863: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.94868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.94879: variable 'omit' from source: magic vars 18714 1726853431.95166: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.95176: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.95268: variable 'network_state' from source: role '' defaults 18714 1726853431.95279: Evaluated conditional (network_state != {}): False 18714 1726853431.95282: when evaluation is False, skipping this task 18714 1726853431.95284: _execute() done 18714 1726853431.95289: dumping result to json 18714 1726853431.95291: done dumping result, returning 18714 1726853431.95298: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-e784-4f7d-000000000050] 18714 1726853431.95302: sending task result for task 02083763-bbaf-e784-4f7d-000000000050 18714 1726853431.95388: done sending task result for task 02083763-bbaf-e784-4f7d-000000000050 18714 1726853431.95391: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 18714 1726853431.95459: no more pending results, returning what we have 18714 1726853431.95463: results queue empty 18714 1726853431.95464: checking for any_errors_fatal 18714 1726853431.95476: done checking for any_errors_fatal 18714 1726853431.95477: checking for 
max_fail_percentage 18714 1726853431.95479: done checking for max_fail_percentage 18714 1726853431.95480: checking to see if all hosts have failed and the running result is not ok 18714 1726853431.95480: done checking to see if all hosts have failed 18714 1726853431.95481: getting the remaining hosts for this loop 18714 1726853431.95482: done getting the remaining hosts for this loop 18714 1726853431.95486: getting the next task for host managed_node1 18714 1726853431.95492: done getting next task for host managed_node1 18714 1726853431.95496: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18714 1726853431.95498: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853431.95514: getting variables 18714 1726853431.95516: in VariableManager get_vars() 18714 1726853431.95546: Calling all_inventory to load vars for managed_node1 18714 1726853431.95548: Calling groups_inventory to load vars for managed_node1 18714 1726853431.95550: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853431.95560: Calling all_plugins_play to load vars for managed_node1 18714 1726853431.95562: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853431.95565: Calling groups_plugins_play to load vars for managed_node1 18714 1726853431.96459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853431.97609: done with get_vars() 18714 1726853431.97625: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:30:31 -0400 
(0:00:00.035) 0:00:28.360 ****** 18714 1726853431.97728: entering _queue_task() for managed_node1/ping 18714 1726853431.98080: worker is 1 (out of 1 available) 18714 1726853431.98098: exiting _queue_task() for managed_node1/ping 18714 1726853431.98109: done queuing things up, now waiting for results queue to drain 18714 1726853431.98110: waiting for pending results... 18714 1726853431.98299: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18714 1726853431.98401: in run() - task 02083763-bbaf-e784-4f7d-000000000051 18714 1726853431.98407: variable 'ansible_search_path' from source: unknown 18714 1726853431.98410: variable 'ansible_search_path' from source: unknown 18714 1726853431.98444: calling self._execute() 18714 1726853431.98531: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.98534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.98548: variable 'omit' from source: magic vars 18714 1726853431.98869: variable 'ansible_distribution_major_version' from source: facts 18714 1726853431.98897: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853431.98900: variable 'omit' from source: magic vars 18714 1726853431.98930: variable 'omit' from source: magic vars 18714 1726853431.98966: variable 'omit' from source: magic vars 18714 1726853431.99005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853431.99036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853431.99063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853431.99083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.99087: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853431.99116: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853431.99121: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.99124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.99217: Set connection var ansible_shell_executable to /bin/sh 18714 1726853431.99222: Set connection var ansible_timeout to 10 18714 1726853431.99227: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853431.99247: Set connection var ansible_connection to ssh 18714 1726853431.99250: Set connection var ansible_shell_type to sh 18714 1726853431.99254: Set connection var ansible_pipelining to False 18714 1726853431.99282: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.99285: variable 'ansible_connection' from source: unknown 18714 1726853431.99288: variable 'ansible_module_compression' from source: unknown 18714 1726853431.99290: variable 'ansible_shell_type' from source: unknown 18714 1726853431.99292: variable 'ansible_shell_executable' from source: unknown 18714 1726853431.99294: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853431.99296: variable 'ansible_pipelining' from source: unknown 18714 1726853431.99298: variable 'ansible_timeout' from source: unknown 18714 1726853431.99299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853431.99477: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853431.99482: variable 'omit' from source: magic vars 18714 1726853431.99505: starting attempt loop 18714 1726853431.99514: running 
the handler 18714 1726853431.99517: _low_level_execute_command(): starting 18714 1726853431.99520: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853432.00075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853432.00079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 18714 1726853432.00088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.00130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853432.00133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.00136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.00187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.01862: stdout chunk (state=3): >>>/root <<< 18714 1726853432.01966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.02010: stderr chunk (state=3): >>><<< 18714 1726853432.02015: stdout chunk (state=3): >>><<< 18714 1726853432.02044: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853432.02060: _low_level_execute_command(): starting 18714 1726853432.02066: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054 `" && echo ansible-tmp-1726853432.0204587-20014-216788639366054="` echo /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054 `" ) && sleep 0' 18714 1726853432.02620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853432.02623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853432.02626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.02637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853432.02640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.02677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.02706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.02778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.04662: stdout chunk (state=3): >>>ansible-tmp-1726853432.0204587-20014-216788639366054=/root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054 <<< 18714 1726853432.04767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.04798: stderr chunk (state=3): >>><<< 18714 1726853432.04801: stdout chunk (state=3): >>><<< 18714 1726853432.04816: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853432.0204587-20014-216788639366054=/root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853432.04856: variable 'ansible_module_compression' from source: unknown 18714 1726853432.04894: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18714 1726853432.04926: variable 'ansible_facts' from source: unknown 18714 1726853432.04975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/AnsiballZ_ping.py 18714 1726853432.05080: Sending initial data 18714 1726853432.05083: Sent initial data (153 bytes) 18714 1726853432.05544: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853432.05629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853432.05633: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.05637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853432.05640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.05642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.05684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.05716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.07278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 
1726853432.07335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853432.07386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp_kpftx8v /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/AnsiballZ_ping.py <<< 18714 1726853432.07391: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/AnsiballZ_ping.py" <<< 18714 1726853432.07506: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18714 1726853432.07510: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp_kpftx8v" to remote "/root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/AnsiballZ_ping.py" <<< 18714 1726853432.08192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.08234: stderr chunk (state=3): >>><<< 18714 1726853432.08237: stdout chunk (state=3): >>><<< 18714 1726853432.08264: done transferring module to remote 18714 1726853432.08268: _low_level_execute_command(): starting 18714 1726853432.08279: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/ /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/AnsiballZ_ping.py && sleep 0' 18714 1726853432.08776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853432.08779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 
1726853432.08781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853432.08783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.08851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.08892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.10747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.10756: stdout chunk (state=3): >>><<< 18714 1726853432.10759: stderr chunk (state=3): >>><<< 18714 1726853432.10877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853432.10880: _low_level_execute_command(): starting 18714 1726853432.10883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/AnsiballZ_ping.py && sleep 0' 18714 1726853432.11575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.11633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853432.11644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.11716: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18714 1726853432.11804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.26817: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18714 1726853432.28081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853432.28109: stderr chunk (state=3): >>><<< 18714 1726853432.28112: stdout chunk (state=3): >>><<< 18714 1726853432.28130: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
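The run above shows the transferred `AnsiballZ_ping.py` executing on the remote host and printing `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` to stdout, which is the entire success contract of `ansible.builtin.ping`. As a reading aid, a minimal sketch of that contract (a simplification, not the actual module source, which also handles the documented `data: crash` failure switch) looks like:

```python
import json

def ping(module_args):
    # ansible.builtin.ping echoes its 'data' argument back under the key
    # 'ping'; with no argument it defaults to the literal string "pong".
    data = module_args.get("data", "pong")
    if data == "crash":
        # The real module deliberately raises here to let tests exercise
        # module-crash handling on the controller side.
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# Emits the same JSON shape seen in the stdout chunk above.
print(json.dumps(ping({"data": "pong"})))
```

The controller only cares that rc=0 and that the stdout parses as JSON with a `ping` key, which is why the surrounding `_low_level_execute_command()` bookkeeping treats the run as `ok`.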
18714 1726853432.28151: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853432.28163: _low_level_execute_command(): starting 18714 1726853432.28167: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853432.0204587-20014-216788639366054/ > /dev/null 2>&1 && sleep 0' 18714 1726853432.28616: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853432.28620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853432.28622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18714 1726853432.28624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853432.28626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.28679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.28684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.28725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.30517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.30545: stderr chunk (state=3): >>><<< 18714 1726853432.30548: stdout chunk (state=3): >>><<< 18714 1726853432.30562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 18714 1726853432.30569: handler run complete 18714 1726853432.30582: attempt loop complete, returning result 18714 1726853432.30585: _execute() done 18714 1726853432.30587: dumping result to json 18714 1726853432.30590: done dumping result, returning 18714 1726853432.30598: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-e784-4f7d-000000000051] 18714 1726853432.30603: sending task result for task 02083763-bbaf-e784-4f7d-000000000051 18714 1726853432.30692: done sending task result for task 02083763-bbaf-e784-4f7d-000000000051 18714 1726853432.30695: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 18714 1726853432.30767: no more pending results, returning what we have 18714 1726853432.30770: results queue empty 18714 1726853432.30773: checking for any_errors_fatal 18714 1726853432.30780: done checking for any_errors_fatal 18714 1726853432.30781: checking for max_fail_percentage 18714 1726853432.30783: done checking for max_fail_percentage 18714 1726853432.30783: checking to see if all hosts have failed and the running result is not ok 18714 1726853432.30784: done checking to see if all hosts have failed 18714 1726853432.30785: getting the remaining hosts for this loop 18714 1726853432.30787: done getting the remaining hosts for this loop 18714 1726853432.30790: getting the next task for host managed_node1 18714 1726853432.30798: done getting next task for host managed_node1 18714 1726853432.30800: ^ task is: TASK: meta (role_complete) 18714 1726853432.30801: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853432.30812: getting variables 18714 1726853432.30814: in VariableManager get_vars() 18714 1726853432.30851: Calling all_inventory to load vars for managed_node1 18714 1726853432.30853: Calling groups_inventory to load vars for managed_node1 18714 1726853432.30856: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853432.30865: Calling all_plugins_play to load vars for managed_node1 18714 1726853432.30867: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853432.30870: Calling groups_plugins_play to load vars for managed_node1 18714 1726853432.31721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853432.32626: done with get_vars() 18714 1726853432.32644: done getting variables 18714 1726853432.32703: done queuing things up, now waiting for results queue to drain 18714 1726853432.32705: results queue empty 18714 1726853432.32706: checking for any_errors_fatal 18714 1726853432.32708: done checking for any_errors_fatal 18714 1726853432.32708: checking for max_fail_percentage 18714 1726853432.32709: done checking for max_fail_percentage 18714 1726853432.32709: checking to see if all hosts have failed and the running result is not ok 18714 1726853432.32710: done checking to see if all hosts have failed 18714 1726853432.32710: getting the remaining hosts for this loop 18714 1726853432.32711: done getting the remaining hosts for this loop 18714 1726853432.32713: getting the next task for host managed_node1 18714 1726853432.32716: done getting next task for host managed_node1 18714 1726853432.32718: ^ task is: TASK: meta (flush_handlers) 18714 1726853432.32719: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18714 1726853432.32721: getting variables 18714 1726853432.32722: in VariableManager get_vars() 18714 1726853432.32731: Calling all_inventory to load vars for managed_node1 18714 1726853432.32732: Calling groups_inventory to load vars for managed_node1 18714 1726853432.32733: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853432.32737: Calling all_plugins_play to load vars for managed_node1 18714 1726853432.32738: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853432.32740: Calling groups_plugins_play to load vars for managed_node1 18714 1726853432.33478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853432.34362: done with get_vars() 18714 1726853432.34378: done getting variables 18714 1726853432.34410: in VariableManager get_vars() 18714 1726853432.34418: Calling all_inventory to load vars for managed_node1 18714 1726853432.34419: Calling groups_inventory to load vars for managed_node1 18714 1726853432.34421: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853432.34424: Calling all_plugins_play to load vars for managed_node1 18714 1726853432.34425: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853432.34427: Calling groups_plugins_play to load vars for managed_node1 18714 1726853432.35070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853432.35944: done with get_vars() 18714 1726853432.35963: done queuing things up, now waiting for results queue to drain 18714 1726853432.35965: results queue empty 18714 1726853432.35965: checking for any_errors_fatal 18714 1726853432.35967: done checking for any_errors_fatal 18714 1726853432.35967: checking for max_fail_percentage 18714 1726853432.35968: done checking for max_fail_percentage 18714 1726853432.35969: checking to see if all hosts have failed and 
the running result is not ok 18714 1726853432.35970: done checking to see if all hosts have failed 18714 1726853432.35972: getting the remaining hosts for this loop 18714 1726853432.35973: done getting the remaining hosts for this loop 18714 1726853432.35975: getting the next task for host managed_node1 18714 1726853432.35978: done getting next task for host managed_node1 18714 1726853432.35978: ^ task is: TASK: meta (flush_handlers) 18714 1726853432.35980: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853432.35981: getting variables 18714 1726853432.35982: in VariableManager get_vars() 18714 1726853432.35989: Calling all_inventory to load vars for managed_node1 18714 1726853432.35990: Calling groups_inventory to load vars for managed_node1 18714 1726853432.35991: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853432.35995: Calling all_plugins_play to load vars for managed_node1 18714 1726853432.35996: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853432.35998: Calling groups_plugins_play to load vars for managed_node1 18714 1726853432.36670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853432.37527: done with get_vars() 18714 1726853432.37541: done getting variables 18714 1726853432.37575: in VariableManager get_vars() 18714 1726853432.37583: Calling all_inventory to load vars for managed_node1 18714 1726853432.37585: Calling groups_inventory to load vars for managed_node1 18714 1726853432.37586: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853432.37589: Calling all_plugins_play to load vars for managed_node1 18714 1726853432.37591: Calling 
groups_plugins_inventory to load vars for managed_node1 18714 1726853432.37594: Calling groups_plugins_play to load vars for managed_node1 18714 1726853432.38218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853432.39097: done with get_vars() 18714 1726853432.39115: done queuing things up, now waiting for results queue to drain 18714 1726853432.39117: results queue empty 18714 1726853432.39117: checking for any_errors_fatal 18714 1726853432.39118: done checking for any_errors_fatal 18714 1726853432.39119: checking for max_fail_percentage 18714 1726853432.39119: done checking for max_fail_percentage 18714 1726853432.39120: checking to see if all hosts have failed and the running result is not ok 18714 1726853432.39120: done checking to see if all hosts have failed 18714 1726853432.39121: getting the remaining hosts for this loop 18714 1726853432.39121: done getting the remaining hosts for this loop 18714 1726853432.39123: getting the next task for host managed_node1 18714 1726853432.39125: done getting next task for host managed_node1 18714 1726853432.39126: ^ task is: None 18714 1726853432.39127: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853432.39127: done queuing things up, now waiting for results queue to drain 18714 1726853432.39128: results queue empty 18714 1726853432.39128: checking for any_errors_fatal 18714 1726853432.39129: done checking for any_errors_fatal 18714 1726853432.39129: checking for max_fail_percentage 18714 1726853432.39130: done checking for max_fail_percentage 18714 1726853432.39130: checking to see if all hosts have failed and the running result is not ok 18714 1726853432.39130: done checking to see if all hosts have failed 18714 1726853432.39131: getting the next task for host managed_node1 18714 1726853432.39132: done getting next task for host managed_node1 18714 1726853432.39133: ^ task is: None 18714 1726853432.39134: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853432.39173: in VariableManager get_vars() 18714 1726853432.39183: done with get_vars() 18714 1726853432.39187: in VariableManager get_vars() 18714 1726853432.39192: done with get_vars() 18714 1726853432.39195: variable 'omit' from source: magic vars 18714 1726853432.39215: in VariableManager get_vars() 18714 1726853432.39223: done with get_vars() 18714 1726853432.39237: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 18714 1726853432.39396: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853432.39415: getting the remaining hosts for this loop 18714 1726853432.39416: done getting the remaining hosts for this loop 18714 1726853432.39418: getting the next task for host managed_node1 18714 1726853432.39419: done getting next task for host managed_node1 18714 1726853432.39421: ^ task is: TASK: Gathering Facts 18714 1726853432.39422: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853432.39423: getting variables 18714 1726853432.39423: in VariableManager get_vars() 18714 1726853432.39429: Calling all_inventory to load vars for managed_node1 18714 1726853432.39431: Calling groups_inventory to load vars for managed_node1 18714 1726853432.39433: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853432.39438: Calling all_plugins_play to load vars for managed_node1 18714 1726853432.39440: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853432.39441: Calling groups_plugins_play to load vars for managed_node1 18714 1726853432.40167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853432.41046: done with get_vars() 18714 1726853432.41062: done getting variables 18714 1726853432.41093: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 13:30:32 -0400 (0:00:00.433) 0:00:28.794 ****** 18714 1726853432.41111: entering _queue_task() for managed_node1/gather_facts 18714 1726853432.41362: worker is 1 (out of 1 available) 18714 1726853432.41376: exiting _queue_task() for managed_node1/gather_facts 18714 1726853432.41386: done queuing things up, now waiting for results queue to drain 18714 1726853432.41387: waiting for pending results... 
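Nearly every stderr chunk in this log repeats the same `auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'` sequence: Ansible's default `ssh` connection plugin passes `-o ControlMaster=auto -o ControlPersist=60s` so that each of these short-lived commands (`echo ~`, `mkdir`, `chmod`, the module run, the `rm -rf` cleanup) reuses one authenticated master connection instead of paying a full SSH handshake per step. A roughly equivalent standalone client configuration (host and socket path are illustrative, not taken from this run) would be:

```
# ~/.ssh/config -- illustrative multiplexing setup mirroring Ansible's
# default ControlPersist behaviour; the ControlPath pattern is hypothetical.
Host 10.31.45.153
    ControlMaster auto        # first connection becomes the master
    ControlPath ~/.ansible/cp/%C   # hashed per-destination socket, as in the log
    ControlPersist 60s        # keep the master alive between commands
```

This is why each `_low_level_execute_command()` ends with `mux_client_request_session: master session id: 2` and `Received exit status from master 0` rather than a fresh key exchange.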
18714 1726853432.41558: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853432.41625: in run() - task 02083763-bbaf-e784-4f7d-0000000003f8 18714 1726853432.41640: variable 'ansible_search_path' from source: unknown 18714 1726853432.41669: calling self._execute() 18714 1726853432.41744: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853432.41748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853432.41758: variable 'omit' from source: magic vars 18714 1726853432.42114: variable 'ansible_distribution_major_version' from source: facts 18714 1726853432.42118: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853432.42121: variable 'omit' from source: magic vars 18714 1726853432.42124: variable 'omit' from source: magic vars 18714 1726853432.42126: variable 'omit' from source: magic vars 18714 1726853432.42277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853432.42280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853432.42283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853432.42285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853432.42287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853432.42305: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853432.42313: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853432.42320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853432.42428: Set connection var ansible_shell_executable to /bin/sh 18714 1726853432.42441: Set 
connection var ansible_timeout to 10 18714 1726853432.42451: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853432.42463: Set connection var ansible_connection to ssh 18714 1726853432.42473: Set connection var ansible_shell_type to sh 18714 1726853432.42483: Set connection var ansible_pipelining to False 18714 1726853432.42509: variable 'ansible_shell_executable' from source: unknown 18714 1726853432.42517: variable 'ansible_connection' from source: unknown 18714 1726853432.42524: variable 'ansible_module_compression' from source: unknown 18714 1726853432.42530: variable 'ansible_shell_type' from source: unknown 18714 1726853432.42536: variable 'ansible_shell_executable' from source: unknown 18714 1726853432.42542: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853432.42549: variable 'ansible_pipelining' from source: unknown 18714 1726853432.42555: variable 'ansible_timeout' from source: unknown 18714 1726853432.42562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853432.42755: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853432.42774: variable 'omit' from source: magic vars 18714 1726853432.42786: starting attempt loop 18714 1726853432.42792: running the handler 18714 1726853432.42813: variable 'ansible_facts' from source: unknown 18714 1726853432.42839: _low_level_execute_command(): starting 18714 1726853432.42851: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853432.43583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853432.43599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 
1726853432.43614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853432.43631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853432.43648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853432.43660: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853432.43678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.43763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.43790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.43867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.45563: stdout chunk (state=3): >>>/root <<< 18714 1726853432.45664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.45699: stderr chunk (state=3): >>><<< 18714 1726853432.45703: stdout chunk (state=3): >>><<< 18714 1726853432.45725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853432.45737: _low_level_execute_command(): starting 18714 1726853432.45742: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119 `" && echo ansible-tmp-1726853432.4572496-20031-65176066138119="` echo /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119 `" ) && sleep 0' 18714 1726853432.46180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853432.46183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.46193: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853432.46196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.46243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853432.46247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.46294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.48166: stdout chunk (state=3): >>>ansible-tmp-1726853432.4572496-20031-65176066138119=/root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119 <<< 18714 1726853432.48286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.48310: stderr chunk (state=3): >>><<< 18714 1726853432.48313: stdout chunk (state=3): >>><<< 18714 1726853432.48333: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853432.4572496-20031-65176066138119=/root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853432.48377: variable 'ansible_module_compression' from source: unknown 18714 1726853432.48395: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853432.48450: variable 'ansible_facts' from source: unknown 18714 1726853432.48580: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/AnsiballZ_setup.py 18714 1726853432.48686: Sending initial data 18714 1726853432.48689: Sent initial data (153 bytes) 18714 1726853432.49105: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853432.49108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.49111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853432.49113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.49165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853432.49170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.49206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.50747: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853432.50783: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853432.50836: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpcvpp2nmc /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/AnsiballZ_setup.py <<< 18714 1726853432.50839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/AnsiballZ_setup.py" <<< 18714 1726853432.50882: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpcvpp2nmc" to remote "/root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/AnsiballZ_setup.py" <<< 18714 1726853432.52279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.52322: stderr chunk (state=3): >>><<< 18714 1726853432.52335: stdout chunk (state=3): >>><<< 18714 1726853432.52358: done transferring module to remote 18714 1726853432.52386: _low_level_execute_command(): starting 18714 1726853432.52389: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/ /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/AnsiballZ_setup.py && sleep 0' 18714 1726853432.53188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.53191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853432.53193: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.53259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.53294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.53363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853432.55178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853432.55181: stdout chunk (state=3): >>><<< 18714 1726853432.55188: stderr chunk (state=3): >>><<< 18714 1726853432.55297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853432.55301: _low_level_execute_command(): starting 18714 1726853432.55303: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/AnsiballZ_setup.py && sleep 0' 18714 1726853432.57153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853432.57465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853432.57513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853432.57732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.23142: stdout chunk (state=3): >>> 
{"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": 
"4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 599, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794852864, "block_size": 4096, "block_total": 65519099, "block_available": 63914759, "block_used": 1604340, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root<<< 18714 1726853434.23181: stdout chunk (state=3): >>>", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.30517578125, "5m": 0.3359375, "15m": 0.16845703125}, "ansible_interfaces": ["eth0", 
"peerlsr27", "lsr27", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": 
"off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": 
{"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "34", "epoch": "1726853434", "epoch_int": "1726853434", "date": "2024-09-20", "time": "13:30:34", "iso8601_micro": 
"2024-09-20T17:30:34.227741Z", "iso8601": "2024-09-20T17:30:34Z", "iso8601_basic": "20240920T133034227741", "iso8601_basic_short": "20240920T133034", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853434.25232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853434.25236: stdout chunk (state=3): >>><<< 18714 1726853434.25238: stderr chunk (state=3): >>><<< 18714 1726853434.25379: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 599, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794852864, "block_size": 4096, "block_total": 65519099, "block_available": 63914759, "block_used": 1604340, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": 
"CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.30517578125, "5m": 0.3359375, "15m": 0.16845703125}, "ansible_interfaces": ["eth0", "peerlsr27", "lsr27", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "b2:9a:3d:31:ac:3c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b09a:3dff:fe31:ac3c", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": 
"off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "f6:d4:d1:51:c0:bf", "mtu": 1500, "active": true, "type": "ether", 
"speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f4d4:d1ff:fe51:c0bf", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::b09a:3dff:fe31:ac3c", "fe80::3a:e7ff:fe40:bc9f", "fe80::f4d4:d1ff:fe51:c0bf"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::b09a:3dff:fe31:ac3c", "fe80::f4d4:d1ff:fe51:c0bf"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "34", 
"epoch": "1726853434", "epoch_int": "1726853434", "date": "2024-09-20", "time": "13:30:34", "iso8601_micro": "2024-09-20T17:30:34.227741Z", "iso8601": "2024-09-20T17:30:34Z", "iso8601_basic": "20240920T133034227741", "iso8601_basic_short": "20240920T133034", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853434.25945: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853434.25976: _low_level_execute_command(): starting 18714 1726853434.25986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853432.4572496-20031-65176066138119/ > /dev/null 2>&1 && sleep 0' 18714 1726853434.26594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853434.26610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853434.26625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853434.26643: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853434.26659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853434.26670: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853434.26687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853434.26705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853434.26793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853434.26808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853434.26833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853434.26855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853434.26928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.28769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853434.28838: stderr chunk (state=3): >>><<< 18714 1726853434.28862: stdout chunk (state=3): >>><<< 18714 1726853434.28887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853434.28903: handler run complete 18714 1726853434.29074: variable 'ansible_facts' from source: unknown 18714 1726853434.29221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.29849: variable 'ansible_facts' from source: unknown 18714 1726853434.30088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.30528: attempt loop complete, returning result 18714 1726853434.30538: _execute() done 18714 1726853434.30549: dumping result to json 18714 1726853434.30594: done dumping result, returning 18714 1726853434.30611: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-0000000003f8] 18714 1726853434.30620: sending task result for task 02083763-bbaf-e784-4f7d-0000000003f8 ok: [managed_node1] 18714 1726853434.31764: no more pending results, returning what we have 18714 1726853434.31767: results queue empty 18714 1726853434.31768: checking for any_errors_fatal 18714 1726853434.31769: done 
checking for any_errors_fatal 18714 1726853434.31770: checking for max_fail_percentage 18714 1726853434.31877: done checking for max_fail_percentage 18714 1726853434.31879: checking to see if all hosts have failed and the running result is not ok 18714 1726853434.31879: done checking to see if all hosts have failed 18714 1726853434.31880: getting the remaining hosts for this loop 18714 1726853434.31881: done getting the remaining hosts for this loop 18714 1726853434.31885: getting the next task for host managed_node1 18714 1726853434.31890: done getting next task for host managed_node1 18714 1726853434.31892: ^ task is: TASK: meta (flush_handlers) 18714 1726853434.31894: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853434.31898: getting variables 18714 1726853434.31899: in VariableManager get_vars() 18714 1726853434.31923: done sending task result for task 02083763-bbaf-e784-4f7d-0000000003f8 18714 1726853434.31925: WORKER PROCESS EXITING 18714 1726853434.31931: Calling all_inventory to load vars for managed_node1 18714 1726853434.31933: Calling groups_inventory to load vars for managed_node1 18714 1726853434.31936: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853434.31945: Calling all_plugins_play to load vars for managed_node1 18714 1726853434.31948: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853434.31951: Calling groups_plugins_play to load vars for managed_node1 18714 1726853434.33367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.35080: done with get_vars() 18714 1726853434.35115: done getting variables 18714 1726853434.35192: in VariableManager get_vars() 18714 
1726853434.35203: Calling all_inventory to load vars for managed_node1 18714 1726853434.35205: Calling groups_inventory to load vars for managed_node1 18714 1726853434.35208: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853434.35213: Calling all_plugins_play to load vars for managed_node1 18714 1726853434.35220: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853434.35224: Calling groups_plugins_play to load vars for managed_node1 18714 1726853434.36401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.37849: done with get_vars() 18714 1726853434.37869: done queuing things up, now waiting for results queue to drain 18714 1726853434.37876: results queue empty 18714 1726853434.37877: checking for any_errors_fatal 18714 1726853434.37880: done checking for any_errors_fatal 18714 1726853434.37880: checking for max_fail_percentage 18714 1726853434.37881: done checking for max_fail_percentage 18714 1726853434.37881: checking to see if all hosts have failed and the running result is not ok 18714 1726853434.37882: done checking to see if all hosts have failed 18714 1726853434.37882: getting the remaining hosts for this loop 18714 1726853434.37883: done getting the remaining hosts for this loop 18714 1726853434.37885: getting the next task for host managed_node1 18714 1726853434.37888: done getting next task for host managed_node1 18714 1726853434.37890: ^ task is: TASK: Include the task 'delete_interface.yml' 18714 1726853434.37891: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853434.37893: getting variables 18714 1726853434.37893: in VariableManager get_vars() 18714 1726853434.37899: Calling all_inventory to load vars for managed_node1 18714 1726853434.37901: Calling groups_inventory to load vars for managed_node1 18714 1726853434.37902: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853434.37906: Calling all_plugins_play to load vars for managed_node1 18714 1726853434.37907: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853434.37909: Calling groups_plugins_play to load vars for managed_node1 18714 1726853434.41873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.42856: done with get_vars() 18714 1726853434.42877: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 13:30:34 -0400 (0:00:02.018) 0:00:30.812 ****** 18714 1726853434.42946: entering _queue_task() for managed_node1/include_tasks 18714 1726853434.43343: worker is 1 (out of 1 available) 18714 1726853434.43357: exiting _queue_task() for managed_node1/include_tasks 18714 1726853434.43369: done queuing things up, now waiting for results queue to drain 18714 1726853434.43370: waiting for pending results... 
18714 1726853434.43631: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 18714 1726853434.43882: in run() - task 02083763-bbaf-e784-4f7d-000000000054 18714 1726853434.43887: variable 'ansible_search_path' from source: unknown 18714 1726853434.43890: calling self._execute() 18714 1726853434.43954: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853434.43967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853434.43992: variable 'omit' from source: magic vars 18714 1726853434.44425: variable 'ansible_distribution_major_version' from source: facts 18714 1726853434.44444: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853434.44460: _execute() done 18714 1726853434.44469: dumping result to json 18714 1726853434.44481: done dumping result, returning 18714 1726853434.44492: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [02083763-bbaf-e784-4f7d-000000000054] 18714 1726853434.44501: sending task result for task 02083763-bbaf-e784-4f7d-000000000054 18714 1726853434.44650: no more pending results, returning what we have 18714 1726853434.44655: in VariableManager get_vars() 18714 1726853434.44691: Calling all_inventory to load vars for managed_node1 18714 1726853434.44694: Calling groups_inventory to load vars for managed_node1 18714 1726853434.44697: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853434.44711: Calling all_plugins_play to load vars for managed_node1 18714 1726853434.44713: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853434.44716: Calling groups_plugins_play to load vars for managed_node1 18714 1726853434.45260: done sending task result for task 02083763-bbaf-e784-4f7d-000000000054 18714 1726853434.45263: WORKER PROCESS EXITING 18714 1726853434.46265: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.48191: done with get_vars() 18714 1726853434.48210: variable 'ansible_search_path' from source: unknown 18714 1726853434.48225: we have included files to process 18714 1726853434.48227: generating all_blocks data 18714 1726853434.48229: done generating all_blocks data 18714 1726853434.48230: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18714 1726853434.48231: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18714 1726853434.48234: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18714 1726853434.48491: done processing included file 18714 1726853434.48493: iterating over new_blocks loaded from include file 18714 1726853434.48495: in VariableManager get_vars() 18714 1726853434.48507: done with get_vars() 18714 1726853434.48509: filtering new block on tags 18714 1726853434.48524: done filtering new block on tags 18714 1726853434.48526: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 18714 1726853434.48532: extending task lists for all hosts with included blocks 18714 1726853434.48566: done extending task lists 18714 1726853434.48567: done processing included files 18714 1726853434.48568: results queue empty 18714 1726853434.48569: checking for any_errors_fatal 18714 1726853434.48575: done checking for any_errors_fatal 18714 1726853434.48577: checking for max_fail_percentage 18714 1726853434.48582: done checking for max_fail_percentage 18714 1726853434.48583: checking to see if all hosts have failed and the running result 
is not ok 18714 1726853434.48583: done checking to see if all hosts have failed 18714 1726853434.48584: getting the remaining hosts for this loop 18714 1726853434.48585: done getting the remaining hosts for this loop 18714 1726853434.48588: getting the next task for host managed_node1 18714 1726853434.48592: done getting next task for host managed_node1 18714 1726853434.48594: ^ task is: TASK: Remove test interface if necessary 18714 1726853434.48596: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853434.48598: getting variables 18714 1726853434.48599: in VariableManager get_vars() 18714 1726853434.48607: Calling all_inventory to load vars for managed_node1 18714 1726853434.48610: Calling groups_inventory to load vars for managed_node1 18714 1726853434.48612: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853434.48617: Calling all_plugins_play to load vars for managed_node1 18714 1726853434.48619: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853434.48622: Calling groups_plugins_play to load vars for managed_node1 18714 1726853434.50390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.54355: done with get_vars() 18714 1726853434.54384: done getting variables 18714 1726853434.54542: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:30:34 -0400 (0:00:00.117) 0:00:30.930 ****** 18714 1726853434.54698: entering _queue_task() for managed_node1/command 18714 1726853434.55131: worker is 1 (out of 1 available) 18714 1726853434.55144: exiting _queue_task() for managed_node1/command 18714 1726853434.55273: done queuing things up, now waiting for results queue to drain 18714 1726853434.55274: waiting for pending results... 
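The next task ("Remove test interface if necessary" at delete_interface.yml:3) runs the command module; further down, the module result shows the executed argv as ["ip", "link", "del", "lsr27"] and the variable trace shows `interface` coming from set_fact. A minimal sketch consistent with those entries — the templated form is an assumption, only the command and variable name are confirmed by the log:

```yaml
# Hypothetical reconstruction of delete_interface.yml:3.
# "ip link del lsr27" and the 'interface' set_fact variable are visible
# in this log; the {{ interface }} templating is an assumption.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
```

Note the final task result is reported with "changed": false even though the raw module output says "changed": true, which is consistent with a `changed_when: false` override on the task.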
18714 1726853434.55456: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 18714 1726853434.55628: in run() - task 02083763-bbaf-e784-4f7d-000000000409 18714 1726853434.55650: variable 'ansible_search_path' from source: unknown 18714 1726853434.55688: variable 'ansible_search_path' from source: unknown 18714 1726853434.55764: calling self._execute() 18714 1726853434.55885: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853434.55898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853434.55912: variable 'omit' from source: magic vars 18714 1726853434.56342: variable 'ansible_distribution_major_version' from source: facts 18714 1726853434.56377: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853434.56464: variable 'omit' from source: magic vars 18714 1726853434.56468: variable 'omit' from source: magic vars 18714 1726853434.56526: variable 'interface' from source: set_fact 18714 1726853434.56547: variable 'omit' from source: magic vars 18714 1726853434.56601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853434.56640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853434.56664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853434.56694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853434.56715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853434.56750: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853434.56761: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853434.56769: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853434.56882: Set connection var ansible_shell_executable to /bin/sh 18714 1726853434.56917: Set connection var ansible_timeout to 10 18714 1726853434.56920: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853434.56930: Set connection var ansible_connection to ssh 18714 1726853434.56976: Set connection var ansible_shell_type to sh 18714 1726853434.56980: Set connection var ansible_pipelining to False 18714 1726853434.56982: variable 'ansible_shell_executable' from source: unknown 18714 1726853434.56984: variable 'ansible_connection' from source: unknown 18714 1726853434.56986: variable 'ansible_module_compression' from source: unknown 18714 1726853434.56987: variable 'ansible_shell_type' from source: unknown 18714 1726853434.56992: variable 'ansible_shell_executable' from source: unknown 18714 1726853434.57007: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853434.57023: variable 'ansible_pipelining' from source: unknown 18714 1726853434.57032: variable 'ansible_timeout' from source: unknown 18714 1726853434.57111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853434.57195: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853434.57205: variable 'omit' from source: magic vars 18714 1726853434.57208: starting attempt loop 18714 1726853434.57211: running the handler 18714 1726853434.57231: _low_level_execute_command(): starting 18714 1726853434.57234: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853434.58307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853434.58388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853434.58415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853434.58432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853434.58445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853434.58512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.60406: stdout chunk (state=3): >>>/root <<< 18714 1726853434.60418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853434.60497: stderr chunk (state=3): >>><<< 18714 1726853434.60501: stdout chunk (state=3): >>><<< 18714 1726853434.60504: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853434.60507: _low_level_execute_command(): starting 18714 1726853434.60607: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847 `" && echo ansible-tmp-1726853434.604771-20129-88049980053847="` echo /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847 `" ) && sleep 0' 18714 1726853434.61333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853434.61348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853434.61380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853434.61478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853434.61502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853434.61601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.63472: stdout chunk (state=3): >>>ansible-tmp-1726853434.604771-20129-88049980053847=/root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847 <<< 18714 1726853434.63769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853434.63775: stdout chunk (state=3): >>><<< 18714 1726853434.63778: stderr chunk (state=3): >>><<< 18714 1726853434.63781: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853434.604771-20129-88049980053847=/root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853434.63785: variable 'ansible_module_compression' from source: unknown 18714 1726853434.63981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853434.63984: variable 'ansible_facts' from source: unknown 18714 1726853434.64145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/AnsiballZ_command.py 18714 1726853434.64503: Sending initial data 18714 1726853434.64532: Sent initial data (154 bytes) 18714 1726853434.65490: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853434.65559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853434.65783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853434.65798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853434.66205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.67727: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18714 1726853434.67759: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18714 1726853434.67867: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853434.67894: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853434.67947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpf_bp6_ho /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/AnsiballZ_command.py <<< 18714 1726853434.67951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/AnsiballZ_command.py" <<< 18714 1726853434.67985: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpf_bp6_ho" to remote "/root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/AnsiballZ_command.py" <<< 18714 1726853434.69180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853434.69235: stderr chunk (state=3): >>><<< 18714 1726853434.69247: stdout chunk (state=3): >>><<< 18714 1726853434.69320: done transferring module to remote 18714 1726853434.69407: _low_level_execute_command(): starting 18714 1726853434.69410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/ /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/AnsiballZ_command.py && sleep 0' 18714 1726853434.70065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853434.70101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853434.70115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853434.70130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853434.70206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.72007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853434.72048: stderr chunk (state=3): >>><<< 18714 1726853434.72051: stdout chunk (state=3): >>><<< 18714 1726853434.72066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853434.72075: _low_level_execute_command(): starting 18714 1726853434.72077: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/AnsiballZ_command.py && sleep 0' 18714 1726853434.72492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853434.72496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853434.72499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853434.72501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853434.72541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853434.72544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853434.72601: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.88878: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 13:30:34.875689", "end": "2024-09-20 13:30:34.886274", "delta": "0:00:00.010585", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853434.91002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853434.91006: stdout chunk (state=3): >>><<< 18714 1726853434.91008: stderr chunk (state=3): >>><<< 18714 1726853434.91082: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 13:30:34.875689", "end": "2024-09-20 13:30:34.886274", "delta": "0:00:00.010585", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853434.91087: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853434.91089: _low_level_execute_command(): starting 18714 1726853434.91091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853434.604771-20129-88049980053847/ > /dev/null 2>&1 && sleep 0' 18714 1726853434.91719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853434.91742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853434.91785: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853434.91845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853434.91949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853434.91986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853434.93902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853434.93905: stdout chunk (state=3): >>><<< 18714 1726853434.93909: stderr chunk (state=3): >>><<< 18714 1726853434.93925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853434.94076: handler run complete 18714 1726853434.94079: Evaluated conditional (False): False 18714 1726853434.94081: attempt loop complete, returning result 18714 1726853434.94084: _execute() done 18714 1726853434.94086: dumping result to json 18714 1726853434.94087: done dumping result, returning 18714 1726853434.94089: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [02083763-bbaf-e784-4f7d-000000000409] 18714 1726853434.94091: sending task result for task 02083763-bbaf-e784-4f7d-000000000409 18714 1726853434.94169: done sending task result for task 02083763-bbaf-e784-4f7d-000000000409 18714 1726853434.94174: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.010585", "end": "2024-09-20 13:30:34.886274", "rc": 0, "start": "2024-09-20 13:30:34.875689" } 18714 1726853434.94242: no more pending results, returning what we have 18714 1726853434.94251: results queue empty 18714 1726853434.94253: checking for any_errors_fatal 18714 1726853434.94255: done checking for any_errors_fatal 18714 1726853434.94255: checking for max_fail_percentage 18714 1726853434.94257: done checking for max_fail_percentage 18714 1726853434.94258: checking to see if all hosts have failed and the running result is not ok 18714 1726853434.94259: done checking to see if 
all hosts have failed 18714 1726853434.94259: getting the remaining hosts for this loop 18714 1726853434.94261: done getting the remaining hosts for this loop 18714 1726853434.94264: getting the next task for host managed_node1 18714 1726853434.94283: done getting next task for host managed_node1 18714 1726853434.94287: ^ task is: TASK: meta (flush_handlers) 18714 1726853434.94289: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853434.94294: getting variables 18714 1726853434.94297: in VariableManager get_vars() 18714 1726853434.94331: Calling all_inventory to load vars for managed_node1 18714 1726853434.94334: Calling groups_inventory to load vars for managed_node1 18714 1726853434.94338: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853434.94353: Calling all_plugins_play to load vars for managed_node1 18714 1726853434.94357: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853434.94361: Calling groups_plugins_play to load vars for managed_node1 18714 1726853434.96288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853434.97837: done with get_vars() 18714 1726853434.97860: done getting variables 18714 1726853434.97931: in VariableManager get_vars() 18714 1726853434.97941: Calling all_inventory to load vars for managed_node1 18714 1726853434.97943: Calling groups_inventory to load vars for managed_node1 18714 1726853434.97945: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853434.97950: Calling all_plugins_play to load vars for managed_node1 18714 1726853434.97952: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853434.97955: Calling 
groups_plugins_play to load vars for managed_node1 18714 1726853434.99046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853435.00646: done with get_vars() 18714 1726853435.00679: done queuing things up, now waiting for results queue to drain 18714 1726853435.00681: results queue empty 18714 1726853435.00682: checking for any_errors_fatal 18714 1726853435.00686: done checking for any_errors_fatal 18714 1726853435.00687: checking for max_fail_percentage 18714 1726853435.00688: done checking for max_fail_percentage 18714 1726853435.00688: checking to see if all hosts have failed and the running result is not ok 18714 1726853435.00689: done checking to see if all hosts have failed 18714 1726853435.00690: getting the remaining hosts for this loop 18714 1726853435.00690: done getting the remaining hosts for this loop 18714 1726853435.00693: getting the next task for host managed_node1 18714 1726853435.00697: done getting next task for host managed_node1 18714 1726853435.00698: ^ task is: TASK: meta (flush_handlers) 18714 1726853435.00700: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853435.00702: getting variables 18714 1726853435.00703: in VariableManager get_vars() 18714 1726853435.00712: Calling all_inventory to load vars for managed_node1 18714 1726853435.00714: Calling groups_inventory to load vars for managed_node1 18714 1726853435.00716: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853435.00723: Calling all_plugins_play to load vars for managed_node1 18714 1726853435.00725: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853435.00728: Calling groups_plugins_play to load vars for managed_node1 18714 1726853435.01944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853435.03536: done with get_vars() 18714 1726853435.03560: done getting variables 18714 1726853435.03613: in VariableManager get_vars() 18714 1726853435.03625: Calling all_inventory to load vars for managed_node1 18714 1726853435.03628: Calling groups_inventory to load vars for managed_node1 18714 1726853435.03630: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853435.03635: Calling all_plugins_play to load vars for managed_node1 18714 1726853435.03637: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853435.03640: Calling groups_plugins_play to load vars for managed_node1 18714 1726853435.04721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853435.06312: done with get_vars() 18714 1726853435.06340: done queuing things up, now waiting for results queue to drain 18714 1726853435.06342: results queue empty 18714 1726853435.06343: checking for any_errors_fatal 18714 1726853435.06344: done checking for any_errors_fatal 18714 1726853435.06345: checking for max_fail_percentage 18714 1726853435.06346: done checking for max_fail_percentage 18714 1726853435.06347: checking to see if all hosts have failed and the running result is not 
ok 18714 1726853435.06348: done checking to see if all hosts have failed 18714 1726853435.06348: getting the remaining hosts for this loop 18714 1726853435.06349: done getting the remaining hosts for this loop 18714 1726853435.06355: getting the next task for host managed_node1 18714 1726853435.06358: done getting next task for host managed_node1 18714 1726853435.06359: ^ task is: None 18714 1726853435.06361: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853435.06362: done queuing things up, now waiting for results queue to drain 18714 1726853435.06363: results queue empty 18714 1726853435.06364: checking for any_errors_fatal 18714 1726853435.06364: done checking for any_errors_fatal 18714 1726853435.06365: checking for max_fail_percentage 18714 1726853435.06366: done checking for max_fail_percentage 18714 1726853435.06367: checking to see if all hosts have failed and the running result is not ok 18714 1726853435.06367: done checking to see if all hosts have failed 18714 1726853435.06368: getting the next task for host managed_node1 18714 1726853435.06377: done getting next task for host managed_node1 18714 1726853435.06378: ^ task is: None 18714 1726853435.06379: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853435.06428: in VariableManager get_vars() 18714 1726853435.06450: done with get_vars() 18714 1726853435.06460: in VariableManager get_vars() 18714 1726853435.06478: done with get_vars() 18714 1726853435.06484: variable 'omit' from source: magic vars 18714 1726853435.06611: variable 'profile' from source: play vars 18714 1726853435.06713: in VariableManager get_vars() 18714 1726853435.06726: done with get_vars() 18714 1726853435.06747: variable 'omit' from source: magic vars 18714 1726853435.06819: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 18714 1726853435.07512: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853435.07536: getting the remaining hosts for this loop 18714 1726853435.07537: done getting the remaining hosts for this loop 18714 1726853435.07540: getting the next task for host managed_node1 18714 1726853435.07542: done getting next task for host managed_node1 18714 1726853435.07544: ^ task is: TASK: Gathering Facts 18714 1726853435.07545: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853435.07547: getting variables 18714 1726853435.07548: in VariableManager get_vars() 18714 1726853435.07562: Calling all_inventory to load vars for managed_node1 18714 1726853435.07564: Calling groups_inventory to load vars for managed_node1 18714 1726853435.07566: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853435.07575: Calling all_plugins_play to load vars for managed_node1 18714 1726853435.07578: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853435.07582: Calling groups_plugins_play to load vars for managed_node1 18714 1726853435.08885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853435.10488: done with get_vars() 18714 1726853435.10508: done getting variables 18714 1726853435.10550: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 13:30:35 -0400 (0:00:00.558) 0:00:31.489 ****** 18714 1726853435.10584: entering _queue_task() for managed_node1/gather_facts 18714 1726853435.10920: worker is 1 (out of 1 available) 18714 1726853435.10935: exiting _queue_task() for managed_node1/gather_facts 18714 1726853435.10949: done queuing things up, now waiting for results queue to drain 18714 1726853435.10950: waiting for pending results... 
18714 1726853435.11385: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853435.11390: in run() - task 02083763-bbaf-e784-4f7d-000000000417 18714 1726853435.11393: variable 'ansible_search_path' from source: unknown 18714 1726853435.11403: calling self._execute() 18714 1726853435.11518: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853435.11530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853435.11543: variable 'omit' from source: magic vars 18714 1726853435.11948: variable 'ansible_distribution_major_version' from source: facts 18714 1726853435.11968: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853435.11983: variable 'omit' from source: magic vars 18714 1726853435.12017: variable 'omit' from source: magic vars 18714 1726853435.12068: variable 'omit' from source: magic vars 18714 1726853435.12113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853435.12164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853435.12194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853435.12216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853435.12235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853435.12280: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853435.12289: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853435.12299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853435.12486: Set connection var ansible_shell_executable to /bin/sh 18714 1726853435.12489: Set 
connection var ansible_timeout to 10 18714 1726853435.12492: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853435.12494: Set connection var ansible_connection to ssh 18714 1726853435.12496: Set connection var ansible_shell_type to sh 18714 1726853435.12497: Set connection var ansible_pipelining to False 18714 1726853435.12500: variable 'ansible_shell_executable' from source: unknown 18714 1726853435.12502: variable 'ansible_connection' from source: unknown 18714 1726853435.12503: variable 'ansible_module_compression' from source: unknown 18714 1726853435.12508: variable 'ansible_shell_type' from source: unknown 18714 1726853435.12516: variable 'ansible_shell_executable' from source: unknown 18714 1726853435.12523: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853435.12531: variable 'ansible_pipelining' from source: unknown 18714 1726853435.12539: variable 'ansible_timeout' from source: unknown 18714 1726853435.12548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853435.12743: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853435.12762: variable 'omit' from source: magic vars 18714 1726853435.12774: starting attempt loop 18714 1726853435.12781: running the handler 18714 1726853435.12802: variable 'ansible_facts' from source: unknown 18714 1726853435.12921: _low_level_execute_command(): starting 18714 1726853435.12924: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853435.13688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853435.13742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853435.13767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853435.13794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853435.13878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853435.15565: stdout chunk (state=3): >>>/root <<< 18714 1726853435.15727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853435.15814: stderr chunk (state=3): >>><<< 18714 1726853435.15817: stdout chunk (state=3): >>><<< 18714 1726853435.15845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853435.15954: _low_level_execute_command(): starting 18714 1726853435.15958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824 `" && echo ansible-tmp-1726853435.1587214-20161-280845475004824="` echo /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824 `" ) && sleep 0' 18714 1726853435.16527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853435.16557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853435.16638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853435.18505: stdout chunk (state=3): >>>ansible-tmp-1726853435.1587214-20161-280845475004824=/root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824 <<< 18714 1726853435.18685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853435.18735: stdout chunk (state=3): >>><<< 18714 1726853435.18747: stderr chunk (state=3): >>><<< 18714 1726853435.18774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853435.1587214-20161-280845475004824=/root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853435.18806: variable 'ansible_module_compression' from source: unknown 18714 1726853435.19050: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853435.19090: variable 'ansible_facts' from source: unknown 18714 1726853435.19397: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/AnsiballZ_setup.py 18714 1726853435.19628: Sending initial data 18714 1726853435.19631: Sent initial data (154 bytes) 18714 1726853435.20891: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853435.21020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853435.21089: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853435.22618: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853435.22678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853435.22743: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpr6ekr1eq /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/AnsiballZ_setup.py <<< 18714 1726853435.22774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/AnsiballZ_setup.py" <<< 18714 1726853435.22814: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpr6ekr1eq" to remote "/root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/AnsiballZ_setup.py" <<< 18714 1726853435.24943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853435.24960: stdout chunk (state=3): >>><<< 18714 1726853435.24985: stderr chunk 
(state=3): >>><<< 18714 1726853435.25176: done transferring module to remote 18714 1726853435.25180: _low_level_execute_command(): starting 18714 1726853435.25182: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/ /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/AnsiballZ_setup.py && sleep 0' 18714 1726853435.26322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853435.26405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853435.26557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853435.26593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853435.28593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853435.28599: stdout chunk (state=3): >>><<< 18714 1726853435.28602: stderr chunk (state=3): >>><<< 18714 1726853435.28605: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853435.28612: _low_level_execute_command(): starting 18714 1726853435.28615: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/AnsiballZ_setup.py && sleep 0' 18714 1726853435.29827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853435.30188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853435.30248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853435.30393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853435.30408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853435.30517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853435.95617: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "35", "epoch": "1726853435", "epoch_int": "1726853435", "date": "2024-09-20", "time": "13:30:35", "iso8601_micro": "2024-09-20T17:30:35.581303Z", "iso8601": "2024-09-20T17:30:35Z", "iso8601_basic": "20240920T133035581303", "iso8601_basic_short": "20240920T133035", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.30517578125, "5m": 0.3359375, "15m": 0.16845703125}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": 
["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2946, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 585, "free": 2946}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 601, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794852864, "block_size": 4096, "block_total": 65519099, "block_available": 63914759, "block_used": 1604340, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853435.97448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853435.97452: stdout chunk (state=3): >>><<< 18714 1726853435.97454: stderr chunk (state=3): >>><<< 18714 1726853435.97678: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": 
"ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "35", "epoch": 
"1726853435", "epoch_int": "1726853435", "date": "2024-09-20", "time": "13:30:35", "iso8601_micro": "2024-09-20T17:30:35.581303Z", "iso8601": "2024-09-20T17:30:35Z", "iso8601_basic": "20240920T133035581303", "iso8601_basic_short": "20240920T133035", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.30517578125, "5m": 0.3359375, "15m": 0.16845703125}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2946, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 585, "free": 2946}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, 
"cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 601, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794852864, "block_size": 4096, "block_total": 65519099, 
"block_available": 63914759, "block_used": 1604340, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853435.97846: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853435.97876: _low_level_execute_command(): starting 18714 1726853435.97886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853435.1587214-20161-280845475004824/ > /dev/null 2>&1 && sleep 0' 18714 1726853435.98773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853435.98977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853435.99188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853435.99355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853436.01192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853436.01222: stderr chunk (state=3): >>><<< 18714 1726853436.01231: stdout chunk (state=3): >>><<< 18714 1726853436.01254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853436.01269: handler run complete 18714 1726853436.01399: variable 'ansible_facts' from source: unknown 18714 1726853436.01512: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.01809: variable 'ansible_facts' from source: unknown 18714 1726853436.01895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.02019: attempt loop complete, returning result 18714 1726853436.02028: _execute() done 18714 1726853436.02034: dumping result to json 18714 1726853436.02066: done dumping result, returning 18714 1726853436.02081: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-000000000417] 18714 1726853436.02090: sending task result for task 02083763-bbaf-e784-4f7d-000000000417 ok: [managed_node1] 18714 1726853436.03022: no more pending results, returning what we have 18714 1726853436.03026: results queue empty 18714 1726853436.03027: checking for any_errors_fatal 18714 1726853436.03029: done checking for any_errors_fatal 18714 1726853436.03030: checking for max_fail_percentage 18714 1726853436.03031: done checking for max_fail_percentage 18714 1726853436.03032: checking to see if all hosts have failed and the running result is not ok 18714 1726853436.03033: done checking to see if all hosts have failed 18714 1726853436.03033: getting the remaining hosts for this loop 18714 1726853436.03034: done getting the remaining hosts for this loop 18714 1726853436.03038: getting the next task for host managed_node1 18714 1726853436.03043: done getting next task for host managed_node1 18714 1726853436.03045: ^ task is: TASK: meta (flush_handlers) 18714 1726853436.03047: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853436.03050: getting variables 18714 1726853436.03051: in VariableManager get_vars() 18714 1726853436.03078: Calling all_inventory to load vars for managed_node1 18714 1726853436.03081: Calling groups_inventory to load vars for managed_node1 18714 1726853436.03084: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.03177: done sending task result for task 02083763-bbaf-e784-4f7d-000000000417 18714 1726853436.03181: WORKER PROCESS EXITING 18714 1726853436.03191: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.03194: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.03198: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.05148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.07591: done with get_vars() 18714 1726853436.07616: done getting variables 18714 1726853436.07710: in VariableManager get_vars() 18714 1726853436.07746: Calling all_inventory to load vars for managed_node1 18714 1726853436.07749: Calling groups_inventory to load vars for managed_node1 18714 1726853436.07753: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.07758: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.07765: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.07769: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.09054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.10910: done with get_vars() 18714 1726853436.10942: done queuing things up, now waiting for results queue to drain 18714 1726853436.10944: results queue empty 18714 1726853436.10945: checking for any_errors_fatal 18714 1726853436.10949: done checking for any_errors_fatal 18714 1726853436.10950: checking for max_fail_percentage 18714 
1726853436.10954: done checking for max_fail_percentage 18714 1726853436.10955: checking to see if all hosts have failed and the running result is not ok 18714 1726853436.10956: done checking to see if all hosts have failed 18714 1726853436.10960: getting the remaining hosts for this loop 18714 1726853436.10962: done getting the remaining hosts for this loop 18714 1726853436.10965: getting the next task for host managed_node1 18714 1726853436.10969: done getting next task for host managed_node1 18714 1726853436.10974: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18714 1726853436.10976: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853436.10986: getting variables 18714 1726853436.10987: in VariableManager get_vars() 18714 1726853436.11002: Calling all_inventory to load vars for managed_node1 18714 1726853436.11005: Calling groups_inventory to load vars for managed_node1 18714 1726853436.11007: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.11011: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.11014: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.11017: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.13083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.14768: done with get_vars() 18714 1726853436.14794: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:30:36 -0400 (0:00:01.042) 0:00:32.532 
****** 18714 1726853436.14880: entering _queue_task() for managed_node1/include_tasks 18714 1726853436.15241: worker is 1 (out of 1 available) 18714 1726853436.15257: exiting _queue_task() for managed_node1/include_tasks 18714 1726853436.15269: done queuing things up, now waiting for results queue to drain 18714 1726853436.15270: waiting for pending results... 18714 1726853436.15531: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18714 1726853436.15644: in run() - task 02083763-bbaf-e784-4f7d-00000000005c 18714 1726853436.15666: variable 'ansible_search_path' from source: unknown 18714 1726853436.15675: variable 'ansible_search_path' from source: unknown 18714 1726853436.15720: calling self._execute() 18714 1726853436.15816: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853436.15827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853436.15843: variable 'omit' from source: magic vars 18714 1726853436.16190: variable 'ansible_distribution_major_version' from source: facts 18714 1726853436.16205: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853436.16215: _execute() done 18714 1726853436.16222: dumping result to json 18714 1726853436.16228: done dumping result, returning 18714 1726853436.16275: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-e784-4f7d-00000000005c] 18714 1726853436.16278: sending task result for task 02083763-bbaf-e784-4f7d-00000000005c 18714 1726853436.16343: done sending task result for task 02083763-bbaf-e784-4f7d-00000000005c 18714 1726853436.16348: WORKER PROCESS EXITING 18714 1726853436.16411: no more pending results, returning what we have 18714 1726853436.16417: in VariableManager get_vars() 18714 1726853436.16467: Calling all_inventory to load vars for managed_node1 18714 
1726853436.16518: Calling groups_inventory to load vars for managed_node1 18714 1726853436.16521: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.16533: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.16536: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.16538: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.18041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.19727: done with get_vars() 18714 1726853436.19757: variable 'ansible_search_path' from source: unknown 18714 1726853436.19759: variable 'ansible_search_path' from source: unknown 18714 1726853436.19790: we have included files to process 18714 1726853436.19791: generating all_blocks data 18714 1726853436.19792: done generating all_blocks data 18714 1726853436.19793: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853436.19794: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853436.19796: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18714 1726853436.20379: done processing included file 18714 1726853436.20381: iterating over new_blocks loaded from include file 18714 1726853436.20383: in VariableManager get_vars() 18714 1726853436.20409: done with get_vars() 18714 1726853436.20411: filtering new block on tags 18714 1726853436.20427: done filtering new block on tags 18714 1726853436.20430: in VariableManager get_vars() 18714 1726853436.20449: done with get_vars() 18714 1726853436.20454: filtering new block on tags 18714 1726853436.20474: done filtering new block on tags 18714 1726853436.20477: in VariableManager get_vars() 18714 1726853436.20501: done with get_vars() 18714 
1726853436.20503: filtering new block on tags 18714 1726853436.20519: done filtering new block on tags 18714 1726853436.20521: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18714 1726853436.20527: extending task lists for all hosts with included blocks 18714 1726853436.20897: done extending task lists 18714 1726853436.20899: done processing included files 18714 1726853436.20899: results queue empty 18714 1726853436.20900: checking for any_errors_fatal 18714 1726853436.20901: done checking for any_errors_fatal 18714 1726853436.20902: checking for max_fail_percentage 18714 1726853436.20903: done checking for max_fail_percentage 18714 1726853436.20903: checking to see if all hosts have failed and the running result is not ok 18714 1726853436.20904: done checking to see if all hosts have failed 18714 1726853436.20905: getting the remaining hosts for this loop 18714 1726853436.20906: done getting the remaining hosts for this loop 18714 1726853436.20908: getting the next task for host managed_node1 18714 1726853436.20911: done getting next task for host managed_node1 18714 1726853436.20914: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18714 1726853436.20916: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853436.20923: getting variables 18714 1726853436.20924: in VariableManager get_vars() 18714 1726853436.20940: Calling all_inventory to load vars for managed_node1 18714 1726853436.20943: Calling groups_inventory to load vars for managed_node1 18714 1726853436.20944: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.20949: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.20950: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.20955: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.22208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.23847: done with get_vars() 18714 1726853436.23878: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:30:36 -0400 (0:00:00.090) 0:00:32.622 ****** 18714 1726853436.23948: entering _queue_task() for managed_node1/setup 18714 1726853436.24485: worker is 1 (out of 1 available) 18714 1726853436.24495: exiting _queue_task() for managed_node1/setup 18714 1726853436.24506: done queuing things up, now waiting for results queue to drain 18714 1726853436.24507: waiting for pending results... 
18714 1726853436.24736: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18714 1726853436.24768: in run() - task 02083763-bbaf-e784-4f7d-000000000458 18714 1726853436.24783: variable 'ansible_search_path' from source: unknown 18714 1726853436.24787: variable 'ansible_search_path' from source: unknown 18714 1726853436.24830: calling self._execute() 18714 1726853436.24945: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853436.24948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853436.24951: variable 'omit' from source: magic vars 18714 1726853436.25459: variable 'ansible_distribution_major_version' from source: facts 18714 1726853436.25463: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853436.25876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853436.27887: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853436.27954: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853436.27994: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853436.28034: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853436.28064: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853436.28149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853436.28184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853436.28210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853436.28262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853436.28278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853436.28334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853436.28361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853436.28388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853436.28426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853436.28448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853436.28619: variable '__network_required_facts' from source: role 
'' defaults 18714 1726853436.28630: variable 'ansible_facts' from source: unknown 18714 1726853436.29376: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18714 1726853436.29380: when evaluation is False, skipping this task 18714 1726853436.29383: _execute() done 18714 1726853436.29386: dumping result to json 18714 1726853436.29388: done dumping result, returning 18714 1726853436.29394: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-e784-4f7d-000000000458] 18714 1726853436.29399: sending task result for task 02083763-bbaf-e784-4f7d-000000000458 18714 1726853436.29493: done sending task result for task 02083763-bbaf-e784-4f7d-000000000458 18714 1726853436.29496: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853436.29565: no more pending results, returning what we have 18714 1726853436.29569: results queue empty 18714 1726853436.29570: checking for any_errors_fatal 18714 1726853436.29573: done checking for any_errors_fatal 18714 1726853436.29574: checking for max_fail_percentage 18714 1726853436.29576: done checking for max_fail_percentage 18714 1726853436.29576: checking to see if all hosts have failed and the running result is not ok 18714 1726853436.29577: done checking to see if all hosts have failed 18714 1726853436.29578: getting the remaining hosts for this loop 18714 1726853436.29579: done getting the remaining hosts for this loop 18714 1726853436.29583: getting the next task for host managed_node1 18714 1726853436.29592: done getting next task for host managed_node1 18714 1726853436.29596: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18714 1726853436.29598: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853436.29613: getting variables 18714 1726853436.29615: in VariableManager get_vars() 18714 1726853436.29661: Calling all_inventory to load vars for managed_node1 18714 1726853436.29665: Calling groups_inventory to load vars for managed_node1 18714 1726853436.29667: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.29687: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.29691: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.29694: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.31424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.33126: done with get_vars() 18714 1726853436.33154: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:30:36 -0400 (0:00:00.093) 0:00:32.716 ****** 18714 1726853436.33260: entering _queue_task() for managed_node1/stat 18714 1726853436.33803: worker is 1 (out of 1 available) 18714 1726853436.33813: exiting _queue_task() for managed_node1/stat 18714 1726853436.33823: done queuing things up, now waiting for results queue to drain 18714 1726853436.33824: waiting for pending results... 
18714 1726853436.33942: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18714 1726853436.34078: in run() - task 02083763-bbaf-e784-4f7d-00000000045a 18714 1726853436.34082: variable 'ansible_search_path' from source: unknown 18714 1726853436.34085: variable 'ansible_search_path' from source: unknown 18714 1726853436.34103: calling self._execute() 18714 1726853436.34277: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853436.34280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853436.34283: variable 'omit' from source: magic vars 18714 1726853436.34572: variable 'ansible_distribution_major_version' from source: facts 18714 1726853436.34583: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853436.34759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853436.35049: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853436.35096: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853436.35123: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853436.35163: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853436.35285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853436.35288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853436.35306: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853436.35337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853436.35423: variable '__network_is_ostree' from source: set_fact 18714 1726853436.35431: Evaluated conditional (not __network_is_ostree is defined): False 18714 1726853436.35434: when evaluation is False, skipping this task 18714 1726853436.35436: _execute() done 18714 1726853436.35439: dumping result to json 18714 1726853436.35443: done dumping result, returning 18714 1726853436.35512: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-e784-4f7d-00000000045a] 18714 1726853436.35515: sending task result for task 02083763-bbaf-e784-4f7d-00000000045a 18714 1726853436.35574: done sending task result for task 02083763-bbaf-e784-4f7d-00000000045a 18714 1726853436.35578: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18714 1726853436.35630: no more pending results, returning what we have 18714 1726853436.35634: results queue empty 18714 1726853436.35635: checking for any_errors_fatal 18714 1726853436.35643: done checking for any_errors_fatal 18714 1726853436.35643: checking for max_fail_percentage 18714 1726853436.35645: done checking for max_fail_percentage 18714 1726853436.35646: checking to see if all hosts have failed and the running result is not ok 18714 1726853436.35647: done checking to see if all hosts have failed 18714 1726853436.35647: getting the remaining hosts for this loop 18714 1726853436.35649: done getting the remaining hosts for this loop 18714 
1726853436.35655: getting the next task for host managed_node1 18714 1726853436.35663: done getting next task for host managed_node1 18714 1726853436.35666: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18714 1726853436.35669: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853436.35687: getting variables 18714 1726853436.35689: in VariableManager get_vars() 18714 1726853436.35728: Calling all_inventory to load vars for managed_node1 18714 1726853436.35732: Calling groups_inventory to load vars for managed_node1 18714 1726853436.35734: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.35745: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.35749: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.35755: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.37357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.39187: done with get_vars() 18714 1726853436.39209: done getting variables 18714 1726853436.39270: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:30:36 -0400 (0:00:00.060) 0:00:32.776 ****** 18714 1726853436.39303: entering _queue_task() for managed_node1/set_fact 18714 1726853436.39620: worker is 1 (out of 1 available) 18714 1726853436.39634: exiting _queue_task() for managed_node1/set_fact 18714 1726853436.39644: done queuing things up, now waiting for results queue to drain 18714 1726853436.39645: waiting for pending results... 18714 1726853436.40017: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18714 1726853436.40111: in run() - task 02083763-bbaf-e784-4f7d-00000000045b 18714 1726853436.40118: variable 'ansible_search_path' from source: unknown 18714 1726853436.40121: variable 'ansible_search_path' from source: unknown 18714 1726853436.40128: calling self._execute() 18714 1726853436.40234: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853436.40237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853436.40254: variable 'omit' from source: magic vars 18714 1726853436.40645: variable 'ansible_distribution_major_version' from source: facts 18714 1726853436.40663: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853436.40836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853436.41277: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853436.41281: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853436.41283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 
1726853436.41286: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853436.41466: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853436.41470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853436.41475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853436.41478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853436.41480: variable '__network_is_ostree' from source: set_fact 18714 1726853436.41488: Evaluated conditional (not __network_is_ostree is defined): False 18714 1726853436.41491: when evaluation is False, skipping this task 18714 1726853436.41494: _execute() done 18714 1726853436.41496: dumping result to json 18714 1726853436.41501: done dumping result, returning 18714 1726853436.41509: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-e784-4f7d-00000000045b] 18714 1726853436.41511: sending task result for task 02083763-bbaf-e784-4f7d-00000000045b 18714 1726853436.41604: done sending task result for task 02083763-bbaf-e784-4f7d-00000000045b 18714 1726853436.41608: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18714 1726853436.41661: no more pending results, returning what we 
have 18714 1726853436.41664: results queue empty 18714 1726853436.41666: checking for any_errors_fatal 18714 1726853436.41674: done checking for any_errors_fatal 18714 1726853436.41675: checking for max_fail_percentage 18714 1726853436.41677: done checking for max_fail_percentage 18714 1726853436.41678: checking to see if all hosts have failed and the running result is not ok 18714 1726853436.41679: done checking to see if all hosts have failed 18714 1726853436.41680: getting the remaining hosts for this loop 18714 1726853436.41681: done getting the remaining hosts for this loop 18714 1726853436.41685: getting the next task for host managed_node1 18714 1726853436.41695: done getting next task for host managed_node1 18714 1726853436.41699: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18714 1726853436.41702: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853436.41716: getting variables 18714 1726853436.41718: in VariableManager get_vars() 18714 1726853436.41758: Calling all_inventory to load vars for managed_node1 18714 1726853436.41761: Calling groups_inventory to load vars for managed_node1 18714 1726853436.41764: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853436.41881: Calling all_plugins_play to load vars for managed_node1 18714 1726853436.41885: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853436.41889: Calling groups_plugins_play to load vars for managed_node1 18714 1726853436.43403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853436.45035: done with get_vars() 18714 1726853436.45061: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:30:36 -0400 (0:00:00.058) 0:00:32.835 ****** 18714 1726853436.45168: entering _queue_task() for managed_node1/service_facts 18714 1726853436.45694: worker is 1 (out of 1 available) 18714 1726853436.45704: exiting _queue_task() for managed_node1/service_facts 18714 1726853436.45712: done queuing things up, now waiting for results queue to drain 18714 1726853436.45713: waiting for pending results... 
18714 1726853436.45810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18714 1726853436.45977: in run() - task 02083763-bbaf-e784-4f7d-00000000045d 18714 1726853436.45981: variable 'ansible_search_path' from source: unknown 18714 1726853436.45984: variable 'ansible_search_path' from source: unknown 18714 1726853436.45996: calling self._execute() 18714 1726853436.46099: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853436.46103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853436.46119: variable 'omit' from source: magic vars 18714 1726853436.46576: variable 'ansible_distribution_major_version' from source: facts 18714 1726853436.46580: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853436.46583: variable 'omit' from source: magic vars 18714 1726853436.46586: variable 'omit' from source: magic vars 18714 1726853436.46614: variable 'omit' from source: magic vars 18714 1726853436.46649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853436.46687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853436.46976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853436.46979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853436.46982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853436.46984: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853436.46987: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853436.46989: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18714 1726853436.46990: Set connection var ansible_shell_executable to /bin/sh 18714 1726853436.46993: Set connection var ansible_timeout to 10 18714 1726853436.46995: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853436.46997: Set connection var ansible_connection to ssh 18714 1726853436.46999: Set connection var ansible_shell_type to sh 18714 1726853436.47001: Set connection var ansible_pipelining to False 18714 1726853436.47003: variable 'ansible_shell_executable' from source: unknown 18714 1726853436.47005: variable 'ansible_connection' from source: unknown 18714 1726853436.47008: variable 'ansible_module_compression' from source: unknown 18714 1726853436.47010: variable 'ansible_shell_type' from source: unknown 18714 1726853436.47012: variable 'ansible_shell_executable' from source: unknown 18714 1726853436.47014: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853436.47016: variable 'ansible_pipelining' from source: unknown 18714 1726853436.47018: variable 'ansible_timeout' from source: unknown 18714 1726853436.47020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853436.47164: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853436.47176: variable 'omit' from source: magic vars 18714 1726853436.47181: starting attempt loop 18714 1726853436.47184: running the handler 18714 1726853436.47199: _low_level_execute_command(): starting 18714 1726853436.47206: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853436.47962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853436.47982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 18714 1726853436.47999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853436.48018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853436.48033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853436.48040: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853436.48133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853436.48144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853436.48159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853436.48180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853436.48259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853436.49947: stdout chunk (state=3): >>>/root <<< 18714 1726853436.50105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853436.50108: stdout chunk (state=3): >>><<< 18714 1726853436.50111: stderr chunk (state=3): >>><<< 18714 1726853436.50226: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853436.50230: _low_level_execute_command(): starting 18714 1726853436.50233: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583 `" && echo ansible-tmp-1726853436.501343-20227-59860084836583="` echo /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583 `" ) && sleep 0' 18714 1726853436.50803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853436.50887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853436.50891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853436.50942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853436.50962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853436.50992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853436.51109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853436.52998: stdout chunk (state=3): >>>ansible-tmp-1726853436.501343-20227-59860084836583=/root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583 <<< 18714 1726853436.53103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853436.53146: stderr chunk (state=3): >>><<< 18714 1726853436.53163: stdout chunk (state=3): >>><<< 18714 1726853436.53189: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853436.501343-20227-59860084836583=/root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853436.53240: variable 'ansible_module_compression' from source: unknown 18714 1726853436.53287: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18714 1726853436.53336: variable 'ansible_facts' from source: unknown 18714 1726853436.53443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/AnsiballZ_service_facts.py 18714 1726853436.53665: Sending initial data 18714 1726853436.53668: Sent initial data (160 bytes) 18714 1726853436.54227: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853436.54243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853436.54277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853436.54317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853436.54328: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853436.54433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853436.54470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853436.54516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853436.56090: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853436.56136: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853436.56192: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpctunyiio /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/AnsiballZ_service_facts.py <<< 18714 1726853436.56196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/AnsiballZ_service_facts.py" <<< 18714 1726853436.56243: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpctunyiio" to remote "/root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/AnsiballZ_service_facts.py" <<< 18714 1726853436.57140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853436.57143: stdout chunk (state=3): >>><<< 18714 1726853436.57145: stderr chunk (state=3): >>><<< 18714 1726853436.57147: done transferring module to remote 18714 1726853436.57149: _low_level_execute_command(): starting 18714 1726853436.57151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/ /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/AnsiballZ_service_facts.py && sleep 0' 18714 1726853436.57744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853436.57802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853436.57877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853436.57902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853436.57927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853436.57993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853436.59791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853436.59794: stdout chunk (state=3): >>><<< 18714 1726853436.59797: stderr chunk (state=3): >>><<< 18714 1726853436.59812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853436.59821: _low_level_execute_command(): starting 18714 1726853436.59832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/AnsiballZ_service_facts.py && sleep 0' 18714 1726853436.60789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853436.60805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853436.60821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853436.60838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853436.60866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853436.60886: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853436.60983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853436.61001: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853436.61017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853436.61038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853436.61115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853438.12731: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 18714 1726853438.12797: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 18714 1726853438.12841: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18714 1726853438.14333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853438.14337: stdout chunk (state=3): >>><<< 18714 1726853438.14339: stderr chunk (state=3): >>><<< 18714 1726853438.14478: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": 
{"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853438.15136: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853438.15233: _low_level_execute_command(): starting 18714 1726853438.15236: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853436.501343-20227-59860084836583/ > /dev/null 2>&1 && sleep 0' 18714 1726853438.15886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853438.15956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853438.15985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853438.15999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853438.16310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853438.18376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853438.18379: stdout chunk (state=3): >>><<< 18714 1726853438.18381: stderr chunk (state=3): >>><<< 18714 1726853438.18383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853438.18385: handler run complete 18714 
1726853438.18407: variable 'ansible_facts' from source: unknown 18714 1726853438.18567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853438.19397: variable 'ansible_facts' from source: unknown 18714 1726853438.19535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853438.19761: attempt loop complete, returning result 18714 1726853438.19775: _execute() done 18714 1726853438.19783: dumping result to json 18714 1726853438.19858: done dumping result, returning 18714 1726853438.19874: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-e784-4f7d-00000000045d] 18714 1726853438.19884: sending task result for task 02083763-bbaf-e784-4f7d-00000000045d 18714 1726853438.21513: done sending task result for task 02083763-bbaf-e784-4f7d-00000000045d 18714 1726853438.21516: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853438.21636: no more pending results, returning what we have 18714 1726853438.21640: results queue empty 18714 1726853438.21641: checking for any_errors_fatal 18714 1726853438.21644: done checking for any_errors_fatal 18714 1726853438.21645: checking for max_fail_percentage 18714 1726853438.21647: done checking for max_fail_percentage 18714 1726853438.21648: checking to see if all hosts have failed and the running result is not ok 18714 1726853438.21649: done checking to see if all hosts have failed 18714 1726853438.21649: getting the remaining hosts for this loop 18714 1726853438.21650: done getting the remaining hosts for this loop 18714 1726853438.21657: getting the next task for host managed_node1 18714 1726853438.21663: done getting next task for host managed_node1 18714 1726853438.21666: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 18714 1726853438.21669: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853438.21682: getting variables 18714 1726853438.21684: in VariableManager get_vars() 18714 1726853438.21713: Calling all_inventory to load vars for managed_node1 18714 1726853438.21716: Calling groups_inventory to load vars for managed_node1 18714 1726853438.21719: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853438.21727: Calling all_plugins_play to load vars for managed_node1 18714 1726853438.21730: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853438.21733: Calling groups_plugins_play to load vars for managed_node1 18714 1726853438.23002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853438.25254: done with get_vars() 18714 1726853438.25281: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:30:38 -0400 (0:00:01.802) 0:00:34.637 ****** 18714 1726853438.25379: entering _queue_task() for managed_node1/package_facts 18714 1726853438.25819: worker is 1 (out of 1 available) 18714 1726853438.25831: exiting _queue_task() for managed_node1/package_facts 18714 1726853438.25841: 
done queuing things up, now waiting for results queue to drain 18714 1726853438.25842: waiting for pending results... 18714 1726853438.26189: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18714 1726853438.26234: in run() - task 02083763-bbaf-e784-4f7d-00000000045e 18714 1726853438.26261: variable 'ansible_search_path' from source: unknown 18714 1726853438.26269: variable 'ansible_search_path' from source: unknown 18714 1726853438.26319: calling self._execute() 18714 1726853438.26473: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853438.26502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853438.26505: variable 'omit' from source: magic vars 18714 1726853438.26938: variable 'ansible_distribution_major_version' from source: facts 18714 1726853438.26948: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853438.26976: variable 'omit' from source: magic vars 18714 1726853438.27031: variable 'omit' from source: magic vars 18714 1726853438.27104: variable 'omit' from source: magic vars 18714 1726853438.27130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853438.27182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853438.27214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853438.27276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853438.27279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853438.27296: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853438.27304: variable 'ansible_host' 
from source: host vars for 'managed_node1' 18714 1726853438.27312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853438.27437: Set connection var ansible_shell_executable to /bin/sh 18714 1726853438.27540: Set connection var ansible_timeout to 10 18714 1726853438.27543: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853438.27546: Set connection var ansible_connection to ssh 18714 1726853438.27548: Set connection var ansible_shell_type to sh 18714 1726853438.27550: Set connection var ansible_pipelining to False 18714 1726853438.27555: variable 'ansible_shell_executable' from source: unknown 18714 1726853438.27557: variable 'ansible_connection' from source: unknown 18714 1726853438.27559: variable 'ansible_module_compression' from source: unknown 18714 1726853438.27561: variable 'ansible_shell_type' from source: unknown 18714 1726853438.27563: variable 'ansible_shell_executable' from source: unknown 18714 1726853438.27565: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853438.27567: variable 'ansible_pipelining' from source: unknown 18714 1726853438.27569: variable 'ansible_timeout' from source: unknown 18714 1726853438.27573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853438.27782: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853438.27869: variable 'omit' from source: magic vars 18714 1726853438.27874: starting attempt loop 18714 1726853438.27877: running the handler 18714 1726853438.27879: _low_level_execute_command(): starting 18714 1726853438.27881: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853438.28686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853438.28763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853438.28790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853438.28813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853438.28879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853438.30518: stdout chunk (state=3): >>>/root <<< 18714 1726853438.30982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853438.30986: stdout chunk (state=3): >>><<< 18714 1726853438.30988: stderr chunk (state=3): >>><<< 18714 1726853438.30992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853438.30995: _low_level_execute_command(): starting 18714 1726853438.31000: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250 `" && echo ansible-tmp-1726853438.3088436-20325-80213770372250="` echo /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250 `" ) && sleep 0' 18714 1726853438.32108: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853438.32375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853438.32401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853438.32546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853438.34333: stdout chunk (state=3): >>>ansible-tmp-1726853438.3088436-20325-80213770372250=/root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250 <<< 18714 1726853438.34531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853438.34709: stderr chunk (state=3): >>><<< 18714 1726853438.34712: stdout chunk (state=3): >>><<< 18714 1726853438.34733: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853438.3088436-20325-80213770372250=/root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853438.34782: variable 'ansible_module_compression' from source: unknown 18714 1726853438.34829: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18714 1726853438.34901: variable 'ansible_facts' from source: unknown 18714 1726853438.35296: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/AnsiballZ_package_facts.py 18714 1726853438.35544: Sending initial data 18714 1726853438.35547: Sent initial data (161 bytes) 18714 1726853438.37205: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853438.37214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853438.37225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853438.37239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853438.37256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853438.37259: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853438.37268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853438.37290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853438.37487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853438.37690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853438.37756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853438.39309: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18714 1726853438.39322: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853438.39349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853438.39398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpumjrt_ic /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/AnsiballZ_package_facts.py <<< 18714 1726853438.39406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/AnsiballZ_package_facts.py" <<< 18714 1726853438.39435: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpumjrt_ic" to remote "/root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/AnsiballZ_package_facts.py" <<< 18714 1726853438.39439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/AnsiballZ_package_facts.py" <<< 18714 1726853438.42333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853438.42337: stderr chunk (state=3): >>><<< 18714 1726853438.42340: stdout chunk (state=3): >>><<< 18714 1726853438.42488: done transferring module to remote 18714 1726853438.42494: _low_level_execute_command(): starting 18714 1726853438.42500: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/ /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/AnsiballZ_package_facts.py && sleep 0' 18714 1726853438.43778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853438.43782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853438.43784: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853438.43787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853438.43791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853438.43985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853438.43989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853438.43996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853438.43998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853438.45978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853438.45982: stdout chunk (state=3): >>><<< 18714 1726853438.45984: stderr chunk (state=3): >>><<< 18714 1726853438.45987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853438.45990: _low_level_execute_command(): starting 18714 1726853438.45992: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/AnsiballZ_package_facts.py && sleep 0' 18714 1726853438.46898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853438.46918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853438.46932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853438.46954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853438.46975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853438.46992: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853438.47035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853438.47108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853438.47149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853438.47181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853438.47228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853438.91953: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": 
[{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 18714 1726853438.92045: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": 
"libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", 
"version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": 
"systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": 
[{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 18714 1726853438.92080: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", 
"version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": 
"9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, 
"arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18714 1726853438.93962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853438.93965: stdout chunk (state=3): >>><<< 18714 1726853438.93968: stderr chunk (state=3): >>><<< 18714 1726853438.94185: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853439.02867: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853439.02897: _low_level_execute_command(): starting 18714 1726853439.02908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853438.3088436-20325-80213770372250/ > /dev/null 2>&1 && sleep 0' 18714 1726853439.03519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853439.03534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853439.03548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853439.03573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853439.03590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853439.03601: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853439.03613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853439.03631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853439.03642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is 
address <<< 18714 1726853439.03654: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853439.03739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853439.03766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853439.03844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853439.05742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853439.05802: stderr chunk (state=3): >>><<< 18714 1726853439.05818: stdout chunk (state=3): >>><<< 18714 1726853439.05835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853439.05844: handler run complete 18714 1726853439.06680: variable 'ansible_facts' from source: unknown 18714 1726853439.07131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.09366: variable 'ansible_facts' from source: unknown 18714 1726853439.09796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.10517: attempt loop complete, returning result 18714 1726853439.10533: _execute() done 18714 1726853439.10538: dumping result to json 18714 1726853439.10757: done dumping result, returning 18714 1726853439.10768: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-e784-4f7d-00000000045e] 18714 1726853439.10836: sending task result for task 02083763-bbaf-e784-4f7d-00000000045e 18714 1726853439.18133: done sending task result for task 02083763-bbaf-e784-4f7d-00000000045e 18714 1726853439.18136: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853439.18255: no more pending results, returning what we have 18714 1726853439.18258: results queue empty 18714 1726853439.18259: checking for any_errors_fatal 18714 1726853439.18262: done checking for any_errors_fatal 18714 1726853439.18263: checking for max_fail_percentage 18714 1726853439.18264: done checking for max_fail_percentage 18714 1726853439.18265: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.18266: 
done checking to see if all hosts have failed 18714 1726853439.18267: getting the remaining hosts for this loop 18714 1726853439.18268: done getting the remaining hosts for this loop 18714 1726853439.18272: getting the next task for host managed_node1 18714 1726853439.18277: done getting next task for host managed_node1 18714 1726853439.18280: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18714 1726853439.18282: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853439.18289: getting variables 18714 1726853439.18290: in VariableManager get_vars() 18714 1726853439.18312: Calling all_inventory to load vars for managed_node1 18714 1726853439.18314: Calling groups_inventory to load vars for managed_node1 18714 1726853439.18316: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.18323: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.18325: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.18328: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.19543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.22737: done with get_vars() 18714 1726853439.22765: done getting variables 18714 1726853439.22818: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:30:39 -0400 (0:00:00.974) 0:00:35.611 ****** 18714 1726853439.22849: entering _queue_task() for managed_node1/debug 18714 1726853439.23619: worker is 1 (out of 1 available) 18714 1726853439.23633: exiting _queue_task() for managed_node1/debug 18714 1726853439.23644: done queuing things up, now waiting for results queue to drain 18714 1726853439.23645: waiting for pending results... 18714 1726853439.24313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18714 1726853439.24477: in run() - task 02083763-bbaf-e784-4f7d-00000000005d 18714 1726853439.24480: variable 'ansible_search_path' from source: unknown 18714 1726853439.24483: variable 'ansible_search_path' from source: unknown 18714 1726853439.24485: calling self._execute() 18714 1726853439.24573: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.24585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.24598: variable 'omit' from source: magic vars 18714 1726853439.24970: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.24987: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.24997: variable 'omit' from source: magic vars 18714 1726853439.25041: variable 'omit' from source: magic vars 18714 1726853439.25141: variable 'network_provider' from source: set_fact 18714 1726853439.25280: variable 'omit' from source: magic vars 18714 1726853439.25283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853439.25286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853439.25288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 
1726853439.25290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853439.25302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853439.25333: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853439.25341: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.25347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.25449: Set connection var ansible_shell_executable to /bin/sh 18714 1726853439.25463: Set connection var ansible_timeout to 10 18714 1726853439.25474: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853439.25486: Set connection var ansible_connection to ssh 18714 1726853439.25495: Set connection var ansible_shell_type to sh 18714 1726853439.25503: Set connection var ansible_pipelining to False 18714 1726853439.25526: variable 'ansible_shell_executable' from source: unknown 18714 1726853439.25976: variable 'ansible_connection' from source: unknown 18714 1726853439.25980: variable 'ansible_module_compression' from source: unknown 18714 1726853439.25983: variable 'ansible_shell_type' from source: unknown 18714 1726853439.25985: variable 'ansible_shell_executable' from source: unknown 18714 1726853439.25988: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.25990: variable 'ansible_pipelining' from source: unknown 18714 1726853439.25992: variable 'ansible_timeout' from source: unknown 18714 1726853439.25994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.25997: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853439.25999: variable 'omit' from source: magic vars 18714 1726853439.26002: starting attempt loop 18714 1726853439.26005: running the handler 18714 1726853439.26033: handler run complete 18714 1726853439.26056: attempt loop complete, returning result 18714 1726853439.26476: _execute() done 18714 1726853439.26481: dumping result to json 18714 1726853439.26484: done dumping result, returning 18714 1726853439.26487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-e784-4f7d-00000000005d] 18714 1726853439.26489: sending task result for task 02083763-bbaf-e784-4f7d-00000000005d 18714 1726853439.26562: done sending task result for task 02083763-bbaf-e784-4f7d-00000000005d 18714 1726853439.26565: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 18714 1726853439.26631: no more pending results, returning what we have 18714 1726853439.26634: results queue empty 18714 1726853439.26635: checking for any_errors_fatal 18714 1726853439.26643: done checking for any_errors_fatal 18714 1726853439.26644: checking for max_fail_percentage 18714 1726853439.26646: done checking for max_fail_percentage 18714 1726853439.26646: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.26647: done checking to see if all hosts have failed 18714 1726853439.26648: getting the remaining hosts for this loop 18714 1726853439.26649: done getting the remaining hosts for this loop 18714 1726853439.26652: getting the next task for host managed_node1 18714 1726853439.26658: done getting next task for host managed_node1 18714 1726853439.26662: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 18714 1726853439.26663: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853439.26677: getting variables 18714 1726853439.26679: in VariableManager get_vars() 18714 1726853439.26710: Calling all_inventory to load vars for managed_node1 18714 1726853439.26712: Calling groups_inventory to load vars for managed_node1 18714 1726853439.26714: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.26722: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.26724: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.26727: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.29645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.33698: done with get_vars() 18714 1726853439.33727: done getting variables 18714 1726853439.33885: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:30:39 -0400 (0:00:00.110) 0:00:35.722 ****** 18714 1726853439.33916: entering _queue_task() for managed_node1/fail 18714 1726853439.34549: worker is 1 (out of 1 available) 18714 1726853439.34561: exiting 
_queue_task() for managed_node1/fail 18714 1726853439.34575: done queuing things up, now waiting for results queue to drain 18714 1726853439.34576: waiting for pending results... 18714 1726853439.35020: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18714 1726853439.35247: in run() - task 02083763-bbaf-e784-4f7d-00000000005e 18714 1726853439.35289: variable 'ansible_search_path' from source: unknown 18714 1726853439.35455: variable 'ansible_search_path' from source: unknown 18714 1726853439.35459: calling self._execute() 18714 1726853439.35739: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.35744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.35748: variable 'omit' from source: magic vars 18714 1726853439.36876: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.37276: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.37279: variable 'network_state' from source: role '' defaults 18714 1726853439.37282: Evaluated conditional (network_state != {}): False 18714 1726853439.37285: when evaluation is False, skipping this task 18714 1726853439.37287: _execute() done 18714 1726853439.37289: dumping result to json 18714 1726853439.37291: done dumping result, returning 18714 1726853439.37294: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-e784-4f7d-00000000005e] 18714 1726853439.37297: sending task result for task 02083763-bbaf-e784-4f7d-00000000005e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 
1726853439.37667: no more pending results, returning what we have 18714 1726853439.37715: results queue empty 18714 1726853439.37716: checking for any_errors_fatal 18714 1726853439.37724: done checking for any_errors_fatal 18714 1726853439.37725: checking for max_fail_percentage 18714 1726853439.37727: done checking for max_fail_percentage 18714 1726853439.37728: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.37728: done checking to see if all hosts have failed 18714 1726853439.37729: getting the remaining hosts for this loop 18714 1726853439.37730: done getting the remaining hosts for this loop 18714 1726853439.37734: getting the next task for host managed_node1 18714 1726853439.37740: done getting next task for host managed_node1 18714 1726853439.37744: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18714 1726853439.37747: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853439.37760: getting variables 18714 1726853439.37761: in VariableManager get_vars() 18714 1726853439.37802: Calling all_inventory to load vars for managed_node1 18714 1726853439.37805: Calling groups_inventory to load vars for managed_node1 18714 1726853439.37807: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.37890: done sending task result for task 02083763-bbaf-e784-4f7d-00000000005e 18714 1726853439.37893: WORKER PROCESS EXITING 18714 1726853439.37906: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.37909: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.37912: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.40903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.44376: done with get_vars() 18714 1726853439.44406: done getting variables 18714 1726853439.44647: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:30:39 -0400 (0:00:00.107) 0:00:35.830 ****** 18714 1726853439.44696: entering _queue_task() for managed_node1/fail 18714 1726853439.45394: worker is 1 (out of 1 available) 18714 1726853439.45539: exiting _queue_task() for managed_node1/fail 18714 1726853439.45551: done queuing things up, now waiting for results queue to drain 18714 1726853439.45552: waiting for pending results... 
18714 1726853439.45869: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18714 1726853439.46230: in run() - task 02083763-bbaf-e784-4f7d-00000000005f 18714 1726853439.46255: variable 'ansible_search_path' from source: unknown 18714 1726853439.46265: variable 'ansible_search_path' from source: unknown 18714 1726853439.46317: calling self._execute() 18714 1726853439.46533: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.46550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.46612: variable 'omit' from source: magic vars 18714 1726853439.47007: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.47024: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.47144: variable 'network_state' from source: role '' defaults 18714 1726853439.47193: Evaluated conditional (network_state != {}): False 18714 1726853439.47196: when evaluation is False, skipping this task 18714 1726853439.47198: _execute() done 18714 1726853439.47200: dumping result to json 18714 1726853439.47202: done dumping result, returning 18714 1726853439.47205: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-e784-4f7d-00000000005f] 18714 1726853439.47207: sending task result for task 02083763-bbaf-e784-4f7d-00000000005f 18714 1726853439.47411: done sending task result for task 02083763-bbaf-e784-4f7d-00000000005f 18714 1726853439.47415: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853439.47462: no more pending results, returning what we have 18714 
1726853439.47466: results queue empty 18714 1726853439.47468: checking for any_errors_fatal 18714 1726853439.47477: done checking for any_errors_fatal 18714 1726853439.47478: checking for max_fail_percentage 18714 1726853439.47480: done checking for max_fail_percentage 18714 1726853439.47480: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.47481: done checking to see if all hosts have failed 18714 1726853439.47481: getting the remaining hosts for this loop 18714 1726853439.47483: done getting the remaining hosts for this loop 18714 1726853439.47486: getting the next task for host managed_node1 18714 1726853439.47639: done getting next task for host managed_node1 18714 1726853439.47644: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18714 1726853439.47646: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853439.47661: getting variables 18714 1726853439.47663: in VariableManager get_vars() 18714 1726853439.47700: Calling all_inventory to load vars for managed_node1 18714 1726853439.47703: Calling groups_inventory to load vars for managed_node1 18714 1726853439.47706: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.47715: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.47718: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.47721: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.49850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.51668: done with get_vars() 18714 1726853439.51751: done getting variables 18714 1726853439.51833: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:30:39 -0400 (0:00:00.071) 0:00:35.902 ****** 18714 1726853439.51863: entering _queue_task() for managed_node1/fail 18714 1726853439.52305: worker is 1 (out of 1 available) 18714 1726853439.52317: exiting _queue_task() for managed_node1/fail 18714 1726853439.52326: done queuing things up, now waiting for results queue to drain 18714 1726853439.52327: waiting for pending results... 
18714 1726853439.52575: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18714 1726853439.52990: in run() - task 02083763-bbaf-e784-4f7d-000000000060 18714 1726853439.52993: variable 'ansible_search_path' from source: unknown 18714 1726853439.52995: variable 'ansible_search_path' from source: unknown 18714 1726853439.52998: calling self._execute() 18714 1726853439.53062: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.53314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.53317: variable 'omit' from source: magic vars 18714 1726853439.53925: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.53974: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.54181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853439.56478: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853439.56542: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853439.56595: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853439.56638: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853439.56675: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853439.56767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.56824: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.56859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.56914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.56933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.57046: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.57068: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18714 1726853439.57231: variable 'ansible_distribution' from source: facts 18714 1726853439.57234: variable '__network_rh_distros' from source: role '' defaults 18714 1726853439.57236: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18714 1726853439.57484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.57515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.57547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 
1726853439.57604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.57666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.57685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.57714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.57746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.57799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.57819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.57948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.58076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18714 1726853439.58079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.58081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.58083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.58401: variable 'network_connections' from source: play vars 18714 1726853439.58416: variable 'profile' from source: play vars 18714 1726853439.58495: variable 'profile' from source: play vars 18714 1726853439.58504: variable 'interface' from source: set_fact 18714 1726853439.58573: variable 'interface' from source: set_fact 18714 1726853439.58588: variable 'network_state' from source: role '' defaults 18714 1726853439.58665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853439.58836: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853439.59077: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853439.59081: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853439.59083: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853439.59085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853439.59095: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853439.59098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.59100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853439.59116: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18714 1726853439.59124: when evaluation is False, skipping this task 18714 1726853439.59132: _execute() done 18714 1726853439.59139: dumping result to json 18714 1726853439.59147: done dumping result, returning 18714 1726853439.59159: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-e784-4f7d-000000000060] 18714 1726853439.59169: sending task result for task 02083763-bbaf-e784-4f7d-000000000060 skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18714 1726853439.59367: no more pending results, returning what we have 18714 1726853439.59373: results queue empty 18714 1726853439.59375: checking for 
any_errors_fatal 18714 1726853439.59382: done checking for any_errors_fatal 18714 1726853439.59383: checking for max_fail_percentage 18714 1726853439.59385: done checking for max_fail_percentage 18714 1726853439.59385: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.59386: done checking to see if all hosts have failed 18714 1726853439.59387: getting the remaining hosts for this loop 18714 1726853439.59388: done getting the remaining hosts for this loop 18714 1726853439.59392: getting the next task for host managed_node1 18714 1726853439.59400: done getting next task for host managed_node1 18714 1726853439.59405: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18714 1726853439.59407: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853439.59420: getting variables 18714 1726853439.59422: in VariableManager get_vars() 18714 1726853439.59475: Calling all_inventory to load vars for managed_node1 18714 1726853439.59478: Calling groups_inventory to load vars for managed_node1 18714 1726853439.59481: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.59493: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.59497: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.59500: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.60100: done sending task result for task 02083763-bbaf-e784-4f7d-000000000060 18714 1726853439.60103: WORKER PROCESS EXITING 18714 1726853439.61209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.63535: done with get_vars() 18714 1726853439.63559: done getting variables 18714 1726853439.63626: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:30:39 -0400 (0:00:00.117) 0:00:36.020 ****** 18714 1726853439.63662: entering _queue_task() for managed_node1/dnf 18714 1726853439.64013: worker is 1 (out of 1 available) 18714 1726853439.64026: exiting _queue_task() for managed_node1/dnf 18714 1726853439.64039: done queuing things up, now waiting for results queue to drain 18714 1726853439.64040: waiting for pending results... 
18714 1726853439.64347: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18714 1726853439.64495: in run() - task 02083763-bbaf-e784-4f7d-000000000061 18714 1726853439.64508: variable 'ansible_search_path' from source: unknown 18714 1726853439.64603: variable 'ansible_search_path' from source: unknown 18714 1726853439.64607: calling self._execute() 18714 1726853439.64669: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.64684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.64700: variable 'omit' from source: magic vars 18714 1726853439.65104: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.65121: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.65331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853439.67578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853439.67651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853439.67699: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853439.67737: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853439.68001: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853439.68084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.68477: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.68489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.68535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.68558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.68677: variable 'ansible_distribution' from source: facts 18714 1726853439.68681: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.68876: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18714 1726853439.68880: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853439.68951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.68987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.69018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.69065: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.69088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.69133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.69164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.69199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.69243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.69266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.69311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.69338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 
1726853439.69374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.69419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.69440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.69599: variable 'network_connections' from source: play vars 18714 1726853439.69618: variable 'profile' from source: play vars 18714 1726853439.69691: variable 'profile' from source: play vars 18714 1726853439.69701: variable 'interface' from source: set_fact 18714 1726853439.69764: variable 'interface' from source: set_fact 18714 1726853439.69841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853439.70026: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853439.70073: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853439.70108: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853439.70141: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853439.70192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853439.70219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853439.70261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.70295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853439.70377: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853439.70584: variable 'network_connections' from source: play vars 18714 1726853439.70596: variable 'profile' from source: play vars 18714 1726853439.70660: variable 'profile' from source: play vars 18714 1726853439.70670: variable 'interface' from source: set_fact 18714 1726853439.70736: variable 'interface' from source: set_fact 18714 1726853439.70777: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853439.70780: when evaluation is False, skipping this task 18714 1726853439.70783: _execute() done 18714 1726853439.70877: dumping result to json 18714 1726853439.70880: done dumping result, returning 18714 1726853439.70883: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000061] 18714 1726853439.70885: sending task result for task 02083763-bbaf-e784-4f7d-000000000061 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18714 1726853439.71026: no more pending results, returning what we have 18714 1726853439.71031: results queue 
empty 18714 1726853439.71032: checking for any_errors_fatal 18714 1726853439.71038: done checking for any_errors_fatal 18714 1726853439.71039: checking for max_fail_percentage 18714 1726853439.71041: done checking for max_fail_percentage 18714 1726853439.71042: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.71042: done checking to see if all hosts have failed 18714 1726853439.71043: getting the remaining hosts for this loop 18714 1726853439.71044: done getting the remaining hosts for this loop 18714 1726853439.71048: getting the next task for host managed_node1 18714 1726853439.71058: done getting next task for host managed_node1 18714 1726853439.71061: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18714 1726853439.71063: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853439.71078: getting variables 18714 1726853439.71080: in VariableManager get_vars() 18714 1726853439.71118: Calling all_inventory to load vars for managed_node1 18714 1726853439.71120: Calling groups_inventory to load vars for managed_node1 18714 1726853439.71122: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.71132: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.71135: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.71137: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.71685: done sending task result for task 02083763-bbaf-e784-4f7d-000000000061 18714 1726853439.71688: WORKER PROCESS EXITING 18714 1726853439.72955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.74502: done with get_vars() 18714 1726853439.74527: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18714 1726853439.74608: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:30:39 -0400 (0:00:00.109) 0:00:36.129 ****** 18714 1726853439.74640: entering _queue_task() for managed_node1/yum 18714 1726853439.74979: worker is 1 (out of 1 available) 18714 1726853439.74992: exiting _queue_task() for managed_node1/yum 18714 1726853439.75004: done queuing things up, now 
waiting for results queue to drain 18714 1726853439.75005: waiting for pending results... 18714 1726853439.75308: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18714 1726853439.75427: in run() - task 02083763-bbaf-e784-4f7d-000000000062 18714 1726853439.75454: variable 'ansible_search_path' from source: unknown 18714 1726853439.75465: variable 'ansible_search_path' from source: unknown 18714 1726853439.75516: calling self._execute() 18714 1726853439.75626: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.75639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.75659: variable 'omit' from source: magic vars 18714 1726853439.76067: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.76086: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.76367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853439.78449: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853439.78523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853439.78567: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853439.78608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853439.78640: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853439.78727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.78782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.78815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.78861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.78888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.78987: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.79008: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18714 1726853439.79177: when evaluation is False, skipping this task 18714 1726853439.79181: _execute() done 18714 1726853439.79184: dumping result to json 18714 1726853439.79186: done dumping result, returning 18714 1726853439.79189: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000062] 18714 1726853439.79191: sending task result for task 02083763-bbaf-e784-4f7d-000000000062 18714 1726853439.79269: done sending task result for task 02083763-bbaf-e784-4f7d-000000000062 18714 1726853439.79275: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18714 1726853439.79331: no more pending results, returning what we have 18714 1726853439.79336: results queue empty 18714 1726853439.79337: checking for any_errors_fatal 18714 1726853439.79345: done checking for any_errors_fatal 18714 1726853439.79345: checking for max_fail_percentage 18714 1726853439.79348: done checking for max_fail_percentage 18714 1726853439.79349: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.79350: done checking to see if all hosts have failed 18714 1726853439.79350: getting the remaining hosts for this loop 18714 1726853439.79355: done getting the remaining hosts for this loop 18714 1726853439.79359: getting the next task for host managed_node1 18714 1726853439.79367: done getting next task for host managed_node1 18714 1726853439.79373: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18714 1726853439.79375: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853439.79390: getting variables 18714 1726853439.79392: in VariableManager get_vars() 18714 1726853439.79437: Calling all_inventory to load vars for managed_node1 18714 1726853439.79440: Calling groups_inventory to load vars for managed_node1 18714 1726853439.79443: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.79458: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.79461: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.79465: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.81057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.82656: done with get_vars() 18714 1726853439.82682: done getting variables 18714 1726853439.82741: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:30:39 -0400 (0:00:00.081) 0:00:36.211 ****** 18714 1726853439.82778: entering _queue_task() for managed_node1/fail 18714 1726853439.83111: worker is 1 (out of 1 available) 18714 1726853439.83123: exiting _queue_task() for managed_node1/fail 18714 1726853439.83134: done queuing things up, now waiting for results queue to drain 18714 1726853439.83135: waiting for pending results... 
18714 1726853439.83498: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18714 1726853439.83548: in run() - task 02083763-bbaf-e784-4f7d-000000000063 18714 1726853439.83575: variable 'ansible_search_path' from source: unknown 18714 1726853439.83583: variable 'ansible_search_path' from source: unknown 18714 1726853439.83625: calling self._execute() 18714 1726853439.83723: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.83776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.83780: variable 'omit' from source: magic vars 18714 1726853439.84129: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.84149: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.84279: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853439.84576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853439.87058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853439.87128: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853439.87167: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853439.87207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853439.87233: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853439.87316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18714 1726853439.87348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.87380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.87423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.87439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.87491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.87524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.87558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.87604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.87628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.87776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853439.87779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853439.87782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.87784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853439.87793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853439.87967: variable 'network_connections' from source: play vars 18714 1726853439.87987: variable 'profile' from source: play vars 18714 1726853439.88068: variable 'profile' from source: play vars 18714 1726853439.88080: variable 'interface' from source: set_fact 18714 1726853439.88147: variable 'interface' from source: set_fact 18714 1726853439.88231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853439.88417: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853439.88463: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853439.88500: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853439.88533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853439.88586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853439.88659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853439.88662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853439.88664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853439.88712: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853439.88950: variable 'network_connections' from source: play vars 18714 1726853439.88962: variable 'profile' from source: play vars 18714 1726853439.89024: variable 'profile' from source: play vars 18714 1726853439.89035: variable 'interface' from source: set_fact 18714 1726853439.89104: variable 'interface' from source: set_fact 18714 1726853439.89131: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853439.89202: when evaluation is False, skipping this task 18714 1726853439.89205: _execute() done 18714 1726853439.89208: dumping result to json 18714 1726853439.89210: done dumping result, returning 18714 1726853439.89213: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000063] 18714 1726853439.89223: sending task result for task 02083763-bbaf-e784-4f7d-000000000063 18714 1726853439.89297: done sending task result for task 02083763-bbaf-e784-4f7d-000000000063 18714 1726853439.89300: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18714 1726853439.89360: no more pending results, returning what we have 18714 1726853439.89364: results queue empty 18714 1726853439.89365: checking for any_errors_fatal 18714 1726853439.89372: done checking for any_errors_fatal 18714 1726853439.89374: checking for max_fail_percentage 18714 1726853439.89376: done checking for max_fail_percentage 18714 1726853439.89377: checking to see if all hosts have failed and the running result is not ok 18714 1726853439.89377: done checking to see if all hosts have failed 18714 1726853439.89378: getting the remaining hosts for this loop 18714 1726853439.89379: done getting the remaining hosts for this loop 18714 1726853439.89383: getting the next task for host managed_node1 18714 1726853439.89391: done getting next task for host managed_node1 18714 1726853439.89394: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18714 1726853439.89396: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853439.89409: getting variables 18714 1726853439.89411: in VariableManager get_vars() 18714 1726853439.89455: Calling all_inventory to load vars for managed_node1 18714 1726853439.89458: Calling groups_inventory to load vars for managed_node1 18714 1726853439.89461: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853439.89775: Calling all_plugins_play to load vars for managed_node1 18714 1726853439.89780: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853439.89785: Calling groups_plugins_play to load vars for managed_node1 18714 1726853439.91322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853439.92908: done with get_vars() 18714 1726853439.92931: done getting variables 18714 1726853439.92995: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:30:39 -0400 (0:00:00.102) 0:00:36.313 ****** 18714 1726853439.93027: entering _queue_task() for managed_node1/package 18714 1726853439.93908: worker is 1 (out of 1 available) 18714 1726853439.93919: exiting _queue_task() for managed_node1/package 18714 1726853439.93929: done queuing things up, now waiting for results queue to drain 18714 1726853439.93929: waiting for pending results... 
18714 1726853439.94488: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18714 1726853439.94679: in run() - task 02083763-bbaf-e784-4f7d-000000000064 18714 1726853439.94683: variable 'ansible_search_path' from source: unknown 18714 1726853439.94688: variable 'ansible_search_path' from source: unknown 18714 1726853439.94717: calling self._execute() 18714 1726853439.94864: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853439.94891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853439.94907: variable 'omit' from source: magic vars 18714 1726853439.95804: variable 'ansible_distribution_major_version' from source: facts 18714 1726853439.95808: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853439.96059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853439.96566: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853439.96721: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853439.96788: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853439.96967: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853439.97332: variable 'network_packages' from source: role '' defaults 18714 1726853439.97549: variable '__network_provider_setup' from source: role '' defaults 18714 1726853439.97555: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853439.97663: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853439.97679: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853439.97742: variable 
'__network_packages_default_nm' from source: role '' defaults 18714 1726853439.97936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853439.99946: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853440.00015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853440.00065: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853440.00103: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853440.00135: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853440.00224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.00268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.00302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.00348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.00378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 
1726853440.00425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.00450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.00483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.00676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.00679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.00749: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18714 1726853440.00862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.00889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.00918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.00958: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.00976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.01064: variable 'ansible_python' from source: facts 18714 1726853440.01093: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18714 1726853440.01182: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853440.01274: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853440.01419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.01454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.01487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.01530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.01557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.01612: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.01650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.01690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.01733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.01756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.01987: variable 'network_connections' from source: play vars 18714 1726853440.01991: variable 'profile' from source: play vars 18714 1726853440.02022: variable 'profile' from source: play vars 18714 1726853440.02034: variable 'interface' from source: set_fact 18714 1726853440.02115: variable 'interface' from source: set_fact 18714 1726853440.02191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853440.02227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853440.02267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.02305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853440.02362: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853440.02663: variable 'network_connections' from source: play vars 18714 1726853440.02677: variable 'profile' from source: play vars 18714 1726853440.02767: variable 'profile' from source: play vars 18714 1726853440.02782: variable 'interface' from source: set_fact 18714 1726853440.02974: variable 'interface' from source: set_fact 18714 1726853440.02979: variable '__network_packages_default_wireless' from source: role '' defaults 18714 1726853440.02981: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853440.03277: variable 'network_connections' from source: play vars 18714 1726853440.03288: variable 'profile' from source: play vars 18714 1726853440.03359: variable 'profile' from source: play vars 18714 1726853440.03369: variable 'interface' from source: set_fact 18714 1726853440.03481: variable 'interface' from source: set_fact 18714 1726853440.03510: variable '__network_packages_default_team' from source: role '' defaults 18714 1726853440.03597: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853440.03913: variable 'network_connections' from source: play vars 18714 1726853440.03962: variable 'profile' from source: play vars 18714 1726853440.03998: variable 'profile' from source: play vars 18714 1726853440.04008: variable 'interface' from source: set_fact 18714 1726853440.04106: variable 'interface' from source: set_fact 18714 1726853440.04161: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853440.04233: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853440.04287: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853440.04320: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853440.04554: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18714 1726853440.05083: variable 'network_connections' from source: play vars 18714 1726853440.05095: variable 'profile' from source: play vars 18714 1726853440.05164: variable 'profile' from source: play vars 18714 1726853440.05269: variable 'interface' from source: set_fact 18714 1726853440.05277: variable 'interface' from source: set_fact 18714 1726853440.05280: variable 'ansible_distribution' from source: facts 18714 1726853440.05282: variable '__network_rh_distros' from source: role '' defaults 18714 1726853440.05284: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.05290: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18714 1726853440.05457: variable 'ansible_distribution' from source: facts 18714 1726853440.05468: variable '__network_rh_distros' from source: role '' defaults 18714 1726853440.05485: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.05504: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18714 1726853440.05669: variable 'ansible_distribution' from source: facts 18714 1726853440.05681: variable '__network_rh_distros' from source: role '' defaults 18714 1726853440.05691: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.05734: variable 'network_provider' from source: set_fact 18714 1726853440.05780: variable 'ansible_facts' from source: unknown 18714 1726853440.06704: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18714 
1726853440.06713: when evaluation is False, skipping this task 18714 1726853440.06721: _execute() done 18714 1726853440.06726: dumping result to json 18714 1726853440.06732: done dumping result, returning 18714 1726853440.06742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-e784-4f7d-000000000064] 18714 1726853440.06749: sending task result for task 02083763-bbaf-e784-4f7d-000000000064 18714 1726853440.06859: done sending task result for task 02083763-bbaf-e784-4f7d-000000000064 18714 1726853440.06862: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18714 1726853440.06943: no more pending results, returning what we have 18714 1726853440.06947: results queue empty 18714 1726853440.06948: checking for any_errors_fatal 18714 1726853440.06957: done checking for any_errors_fatal 18714 1726853440.06958: checking for max_fail_percentage 18714 1726853440.06961: done checking for max_fail_percentage 18714 1726853440.06961: checking to see if all hosts have failed and the running result is not ok 18714 1726853440.06962: done checking to see if all hosts have failed 18714 1726853440.06963: getting the remaining hosts for this loop 18714 1726853440.06964: done getting the remaining hosts for this loop 18714 1726853440.06968: getting the next task for host managed_node1 18714 1726853440.07077: done getting next task for host managed_node1 18714 1726853440.07082: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18714 1726853440.07084: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18714 1726853440.07097: getting variables 18714 1726853440.07099: in VariableManager get_vars() 18714 1726853440.07140: Calling all_inventory to load vars for managed_node1 18714 1726853440.07143: Calling groups_inventory to load vars for managed_node1 18714 1726853440.07146: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853440.07165: Calling all_plugins_play to load vars for managed_node1 18714 1726853440.07169: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853440.07378: Calling groups_plugins_play to load vars for managed_node1 18714 1726853440.08744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853440.10514: done with get_vars() 18714 1726853440.10537: done getting variables 18714 1726853440.10605: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:30:40 -0400 (0:00:00.176) 0:00:36.489 ****** 18714 1726853440.10638: entering _queue_task() for managed_node1/package 18714 1726853440.11399: worker is 1 (out of 1 available) 18714 1726853440.11413: exiting _queue_task() for managed_node1/package 18714 1726853440.11425: done queuing things up, now waiting for results queue to drain 18714 1726853440.11426: waiting for pending results... 
18714 1726853440.11956: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18714 1726853440.12153: in run() - task 02083763-bbaf-e784-4f7d-000000000065 18714 1726853440.12156: variable 'ansible_search_path' from source: unknown 18714 1726853440.12160: variable 'ansible_search_path' from source: unknown 18714 1726853440.12162: calling self._execute() 18714 1726853440.12230: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853440.12243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853440.12264: variable 'omit' from source: magic vars 18714 1726853440.12665: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.12684: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853440.12817: variable 'network_state' from source: role '' defaults 18714 1726853440.12834: Evaluated conditional (network_state != {}): False 18714 1726853440.12842: when evaluation is False, skipping this task 18714 1726853440.12849: _execute() done 18714 1726853440.12855: dumping result to json 18714 1726853440.12862: done dumping result, returning 18714 1726853440.12874: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-e784-4f7d-000000000065] 18714 1726853440.12884: sending task result for task 02083763-bbaf-e784-4f7d-000000000065 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853440.13061: no more pending results, returning what we have 18714 1726853440.13065: results queue empty 18714 1726853440.13066: checking for any_errors_fatal 18714 1726853440.13177: done checking for any_errors_fatal 18714 1726853440.13178: checking for max_fail_percentage 18714 
1726853440.13180: done checking for max_fail_percentage 18714 1726853440.13181: checking to see if all hosts have failed and the running result is not ok 18714 1726853440.13182: done checking to see if all hosts have failed 18714 1726853440.13183: getting the remaining hosts for this loop 18714 1726853440.13184: done getting the remaining hosts for this loop 18714 1726853440.13187: getting the next task for host managed_node1 18714 1726853440.13192: done getting next task for host managed_node1 18714 1726853440.13195: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18714 1726853440.13197: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853440.13214: getting variables 18714 1726853440.13216: in VariableManager get_vars() 18714 1726853440.13247: Calling all_inventory to load vars for managed_node1 18714 1726853440.13249: Calling groups_inventory to load vars for managed_node1 18714 1726853440.13253: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853440.13264: Calling all_plugins_play to load vars for managed_node1 18714 1726853440.13266: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853440.13269: Calling groups_plugins_play to load vars for managed_node1 18714 1726853440.13317: done sending task result for task 02083763-bbaf-e784-4f7d-000000000065 18714 1726853440.13319: WORKER PROCESS EXITING 18714 1726853440.15532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853440.17358: done with get_vars() 18714 1726853440.17386: done getting variables 18714 1726853440.17444: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:30:40 -0400 (0:00:00.070) 0:00:36.560 ****** 18714 1726853440.17683: entering _queue_task() for managed_node1/package 18714 1726853440.18207: worker is 1 (out of 1 available) 18714 1726853440.18221: exiting _queue_task() for managed_node1/package 18714 1726853440.18232: done queuing things up, now waiting for results queue to drain 18714 1726853440.18233: waiting for pending results... 18714 1726853440.18736: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18714 1726853440.19077: in run() - task 02083763-bbaf-e784-4f7d-000000000066 18714 1726853440.19081: variable 'ansible_search_path' from source: unknown 18714 1726853440.19084: variable 'ansible_search_path' from source: unknown 18714 1726853440.19133: calling self._execute() 18714 1726853440.19290: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853440.19350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853440.19391: variable 'omit' from source: magic vars 18714 1726853440.20049: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.20067: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853440.20208: variable 'network_state' from source: role '' defaults 18714 1726853440.20226: Evaluated conditional (network_state != {}): False 18714 1726853440.20276: when evaluation is False, 
skipping this task 18714 1726853440.20279: _execute() done 18714 1726853440.20282: dumping result to json 18714 1726853440.20284: done dumping result, returning 18714 1726853440.20287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-e784-4f7d-000000000066] 18714 1726853440.20289: sending task result for task 02083763-bbaf-e784-4f7d-000000000066 18714 1726853440.20493: done sending task result for task 02083763-bbaf-e784-4f7d-000000000066 18714 1726853440.20497: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853440.20544: no more pending results, returning what we have 18714 1726853440.20549: results queue empty 18714 1726853440.20549: checking for any_errors_fatal 18714 1726853440.20560: done checking for any_errors_fatal 18714 1726853440.20560: checking for max_fail_percentage 18714 1726853440.20562: done checking for max_fail_percentage 18714 1726853440.20563: checking to see if all hosts have failed and the running result is not ok 18714 1726853440.20563: done checking to see if all hosts have failed 18714 1726853440.20564: getting the remaining hosts for this loop 18714 1726853440.20565: done getting the remaining hosts for this loop 18714 1726853440.20568: getting the next task for host managed_node1 18714 1726853440.20577: done getting next task for host managed_node1 18714 1726853440.20581: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18714 1726853440.20582: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853440.20596: getting variables 18714 1726853440.20598: in VariableManager get_vars() 18714 1726853440.20631: Calling all_inventory to load vars for managed_node1 18714 1726853440.20634: Calling groups_inventory to load vars for managed_node1 18714 1726853440.20636: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853440.20645: Calling all_plugins_play to load vars for managed_node1 18714 1726853440.20647: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853440.20650: Calling groups_plugins_play to load vars for managed_node1 18714 1726853440.24042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853440.27744: done with get_vars() 18714 1726853440.27895: done getting variables 18714 1726853440.27961: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:30:40 -0400 (0:00:00.103) 0:00:36.664 ****** 18714 1726853440.28084: entering _queue_task() for managed_node1/service 18714 1726853440.28861: worker is 1 (out of 1 available) 18714 1726853440.28876: exiting _queue_task() for managed_node1/service 18714 1726853440.28888: done queuing things up, now waiting for results queue to drain 18714 1726853440.28889: waiting for pending results... 
18714 1726853440.29386: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18714 1726853440.29540: in run() - task 02083763-bbaf-e784-4f7d-000000000067 18714 1726853440.29560: variable 'ansible_search_path' from source: unknown 18714 1726853440.29626: variable 'ansible_search_path' from source: unknown 18714 1726853440.29669: calling self._execute() 18714 1726853440.29825: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853440.30176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853440.30178: variable 'omit' from source: magic vars 18714 1726853440.30787: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.30865: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853440.30985: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853440.31459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853440.34020: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853440.34087: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853440.34476: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853440.34480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853440.34482: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853440.34485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18714 1726853440.34876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.34880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.34883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.34885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.34887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.34889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.34892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.35103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.35123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.35167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.35197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.35226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.35514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.35675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.35701: variable 'network_connections' from source: play vars 18714 1726853440.35718: variable 'profile' from source: play vars 18714 1726853440.35794: variable 'profile' from source: play vars 18714 1726853440.35978: variable 'interface' from source: set_fact 18714 1726853440.36041: variable 'interface' from source: set_fact 18714 1726853440.36124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853440.36534: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853440.36578: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853440.36614: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853440.36803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853440.36848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853440.36876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853440.36904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.36932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853440.36985: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853440.37509: variable 'network_connections' from source: play vars 18714 1726853440.37523: variable 'profile' from source: play vars 18714 1726853440.37587: variable 'profile' from source: play vars 18714 1726853440.37683: variable 'interface' from source: set_fact 18714 1726853440.37744: variable 'interface' from source: set_fact 18714 1726853440.37902: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18714 1726853440.37909: when evaluation is False, skipping this task 18714 1726853440.37916: _execute() done 18714 1726853440.37922: dumping result to json 18714 1726853440.37928: done dumping result, returning 18714 1726853440.37939: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [02083763-bbaf-e784-4f7d-000000000067] 18714 1726853440.37955: sending task result for task 02083763-bbaf-e784-4f7d-000000000067 18714 1726853440.38061: done sending task result for task 02083763-bbaf-e784-4f7d-000000000067 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18714 1726853440.38113: no more pending results, returning what we have 18714 1726853440.38116: results queue empty 18714 1726853440.38117: checking for any_errors_fatal 18714 1726853440.38125: done checking for any_errors_fatal 18714 1726853440.38126: checking for max_fail_percentage 18714 1726853440.38127: done checking for max_fail_percentage 18714 1726853440.38128: checking to see if all hosts have failed and the running result is not ok 18714 1726853440.38128: done checking to see if all hosts have failed 18714 1726853440.38129: getting the remaining hosts for this loop 18714 1726853440.38130: done getting the remaining hosts for this loop 18714 1726853440.38134: getting the next task for host managed_node1 18714 1726853440.38141: done getting next task for host managed_node1 18714 1726853440.38145: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18714 1726853440.38146: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853440.38162: getting variables 18714 1726853440.38164: in VariableManager get_vars() 18714 1726853440.38207: Calling all_inventory to load vars for managed_node1 18714 1726853440.38210: Calling groups_inventory to load vars for managed_node1 18714 1726853440.38213: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853440.38224: Calling all_plugins_play to load vars for managed_node1 18714 1726853440.38228: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853440.38232: Calling groups_plugins_play to load vars for managed_node1 18714 1726853440.39678: WORKER PROCESS EXITING 18714 1726853440.41694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853440.43877: done with get_vars() 18714 1726853440.43907: done getting variables 18714 1726853440.43993: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:30:40 -0400 (0:00:00.159) 0:00:36.823 ****** 18714 1726853440.44026: entering _queue_task() for managed_node1/service 18714 1726853440.44527: worker is 1 (out of 1 available) 18714 1726853440.44540: exiting _queue_task() for managed_node1/service 18714 1726853440.44554: done queuing things up, now waiting for results queue to drain 18714 1726853440.44555: waiting for pending results... 
18714 1726853440.44780: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18714 1726853440.45077: in run() - task 02083763-bbaf-e784-4f7d-000000000068 18714 1726853440.45082: variable 'ansible_search_path' from source: unknown 18714 1726853440.45084: variable 'ansible_search_path' from source: unknown 18714 1726853440.45087: calling self._execute() 18714 1726853440.45089: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853440.45092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853440.45095: variable 'omit' from source: magic vars 18714 1726853440.45502: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.45514: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853440.45875: variable 'network_provider' from source: set_fact 18714 1726853440.45880: variable 'network_state' from source: role '' defaults 18714 1726853440.45883: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18714 1726853440.45886: variable 'omit' from source: magic vars 18714 1726853440.45889: variable 'omit' from source: magic vars 18714 1726853440.45891: variable 'network_service_name' from source: role '' defaults 18714 1726853440.45894: variable 'network_service_name' from source: role '' defaults 18714 1726853440.46002: variable '__network_provider_setup' from source: role '' defaults 18714 1726853440.46012: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853440.46082: variable '__network_service_name_default_nm' from source: role '' defaults 18714 1726853440.46090: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853440.46167: variable '__network_packages_default_nm' from source: role '' defaults 18714 1726853440.46414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18714 1726853440.49050: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853440.49122: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853440.49159: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853440.49209: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853440.49241: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853440.49338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.49372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.49396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.49447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.49464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.49523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18714 1726853440.49546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.49579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.49609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.49641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.49925: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18714 1726853440.50289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.50313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.50332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.50375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.50396: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.50541: variable 'ansible_python' from source: facts 18714 1726853440.50774: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18714 1726853440.50778: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853440.50860: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853440.51244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.51478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.51508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.51550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.51569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.51621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853440.51655: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853440.51685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.51727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853440.51746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853440.51887: variable 'network_connections' from source: play vars 18714 1726853440.51900: variable 'profile' from source: play vars 18714 1726853440.51974: variable 'profile' from source: play vars 18714 1726853440.51984: variable 'interface' from source: set_fact 18714 1726853440.52041: variable 'interface' from source: set_fact 18714 1726853440.52277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853440.52322: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853440.52677: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853440.52681: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853440.52683: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853440.52685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853440.52797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853440.53276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853440.53280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853440.53282: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853440.53655: variable 'network_connections' from source: play vars 18714 1726853440.53786: variable 'profile' from source: play vars 18714 1726853440.53861: variable 'profile' from source: play vars 18714 1726853440.53874: variable 'interface' from source: set_fact 18714 1726853440.53935: variable 'interface' from source: set_fact 18714 1726853440.54108: variable '__network_packages_default_wireless' from source: role '' defaults 18714 1726853440.54188: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853440.54659: variable 'network_connections' from source: play vars 18714 1726853440.54885: variable 'profile' from source: play vars 18714 1726853440.54950: variable 'profile' from source: play vars 18714 1726853440.54961: variable 'interface' from source: set_fact 18714 1726853440.55026: variable 'interface' from source: set_fact 18714 1726853440.55100: variable '__network_packages_default_team' from source: role '' defaults 18714 1726853440.55358: variable '__network_team_connections_defined' from source: role '' defaults 18714 1726853440.55852: variable 
'network_connections' from source: play vars 18714 1726853440.55986: variable 'profile' from source: play vars 18714 1726853440.56057: variable 'profile' from source: play vars 18714 1726853440.56067: variable 'interface' from source: set_fact 18714 1726853440.56145: variable 'interface' from source: set_fact 18714 1726853440.56330: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853440.56394: variable '__network_service_name_default_initscripts' from source: role '' defaults 18714 1726853440.56490: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853440.56551: variable '__network_packages_default_initscripts' from source: role '' defaults 18714 1726853440.56983: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18714 1726853440.57955: variable 'network_connections' from source: play vars 18714 1726853440.57984: variable 'profile' from source: play vars 18714 1726853440.58376: variable 'profile' from source: play vars 18714 1726853440.58379: variable 'interface' from source: set_fact 18714 1726853440.58382: variable 'interface' from source: set_fact 18714 1726853440.58384: variable 'ansible_distribution' from source: facts 18714 1726853440.58386: variable '__network_rh_distros' from source: role '' defaults 18714 1726853440.58388: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.58390: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18714 1726853440.58676: variable 'ansible_distribution' from source: facts 18714 1726853440.58685: variable '__network_rh_distros' from source: role '' defaults 18714 1726853440.58695: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.58713: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18714 1726853440.58995: variable 'ansible_distribution' from source: 
facts 18714 1726853440.59086: variable '__network_rh_distros' from source: role '' defaults 18714 1726853440.59099: variable 'ansible_distribution_major_version' from source: facts 18714 1726853440.59375: variable 'network_provider' from source: set_fact 18714 1726853440.59378: variable 'omit' from source: magic vars 18714 1726853440.59380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853440.59393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853440.59414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853440.59433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853440.59447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853440.59480: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853440.59776: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853440.59780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853440.59893: Set connection var ansible_shell_executable to /bin/sh 18714 1726853440.59906: Set connection var ansible_timeout to 10 18714 1726853440.59917: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853440.59928: Set connection var ansible_connection to ssh 18714 1726853440.59938: Set connection var ansible_shell_type to sh 18714 1726853440.59947: Set connection var ansible_pipelining to False 18714 1726853440.59976: variable 'ansible_shell_executable' from source: unknown 18714 1726853440.59985: variable 'ansible_connection' from source: unknown 18714 1726853440.59992: variable 'ansible_module_compression' from source: unknown 18714 1726853440.59999: 
variable 'ansible_shell_type' from source: unknown 18714 1726853440.60005: variable 'ansible_shell_executable' from source: unknown 18714 1726853440.60053: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853440.60070: variable 'ansible_pipelining' from source: unknown 18714 1726853440.60082: variable 'ansible_timeout' from source: unknown 18714 1726853440.60090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853440.60576: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853440.60581: variable 'omit' from source: magic vars 18714 1726853440.60583: starting attempt loop 18714 1726853440.60585: running the handler 18714 1726853440.60587: variable 'ansible_facts' from source: unknown 18714 1726853440.61277: _low_level_execute_command(): starting 18714 1726853440.61289: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853440.61956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853440.61974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853440.61992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853440.62011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853440.62029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853440.62059: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853440.62076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853440.62095: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853440.62185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853440.62208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853440.62304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853440.64010: stdout chunk (state=3): >>>/root <<< 18714 1726853440.64134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853440.64182: stderr chunk (state=3): >>><<< 18714 1726853440.64223: stdout chunk (state=3): >>><<< 18714 1726853440.64380: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853440.64383: _low_level_execute_command(): starting 18714 1726853440.64386: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397 `" && echo ansible-tmp-1726853440.6433237-20421-45846381961397="` echo /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397 `" ) && sleep 0' 18714 1726853440.65686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853440.65773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853440.65782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853440.65792: stderr chunk (state=3): >>>debug2: match found <<< 18714 1726853440.65799: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853440.66022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853440.66092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853440.66238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853440.68132: stdout chunk (state=3): >>>ansible-tmp-1726853440.6433237-20421-45846381961397=/root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397 <<< 18714 1726853440.68245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853440.68284: stderr chunk (state=3): >>><<< 18714 1726853440.68576: stdout chunk (state=3): >>><<< 18714 1726853440.68579: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853440.6433237-20421-45846381961397=/root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853440.68582: variable 'ansible_module_compression' from source: unknown 18714 1726853440.68588: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18714 1726853440.68591: variable 'ansible_facts' from source: unknown 18714 1726853440.68976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/AnsiballZ_systemd.py 18714 1726853440.69001: Sending initial data 18714 1726853440.69003: Sent initial data (155 bytes) 18714 1726853440.69631: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853440.69640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853440.69651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853440.69976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853440.69980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853440.70113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853440.70170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853440.71725: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853440.71748: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853440.71834: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4gz_pd10 /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/AnsiballZ_systemd.py <<< 18714 1726853440.71838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/AnsiballZ_systemd.py" <<< 18714 1726853440.71899: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18714 1726853440.71908: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4gz_pd10" to remote "/root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/AnsiballZ_systemd.py" <<< 18714 1726853440.71917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/AnsiballZ_systemd.py" <<< 18714 1726853440.75427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853440.75432: stdout chunk (state=3): >>><<< 18714 1726853440.75437: stderr chunk (state=3): >>><<< 18714 1726853440.75501: done transferring module to remote 18714 1726853440.75510: _low_level_execute_command(): starting 18714 1726853440.75515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/ /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/AnsiballZ_systemd.py && sleep 0' 18714 1726853440.76412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853440.76429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853440.76445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853440.76465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853440.76538: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853440.76568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853440.76587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853440.76600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853440.76751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853440.78707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853440.78716: stdout chunk (state=3): >>><<< 18714 1726853440.78725: stderr chunk (state=3): >>><<< 18714 1726853440.78740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853440.78747: _low_level_execute_command(): starting 18714 1726853440.78756: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/AnsiballZ_systemd.py && sleep 0' 18714 1726853440.79485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853440.79500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853440.79587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853440.79619: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853440.79635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853440.79657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853440.79736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853441.08718: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10629120", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314688000", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "907444000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 18714 1726853441.08763: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18714 1726853441.10829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853441.10852: stderr chunk (state=3): >>><<< 18714 1726853441.10863: stdout chunk (state=3): >>><<< 18714 1726853441.10891: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10629120", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314688000", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "907444000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853441.11191: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853441.11194: _low_level_execute_command(): starting 18714 1726853441.11196: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853440.6433237-20421-45846381961397/ > /dev/null 2>&1 && sleep 0' 18714 1726853441.11730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853441.11742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853441.11784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853441.11855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853441.11876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853441.11907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853441.11979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853441.13856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853441.13869: stderr chunk (state=3): >>><<< 18714 1726853441.13882: stdout chunk (state=3): >>><<< 18714 1726853441.13903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 18714 1726853441.13916: handler run complete 18714 1726853441.14076: attempt loop complete, returning result 18714 1726853441.14080: _execute() done 18714 1726853441.14082: dumping result to json 18714 1726853441.14084: done dumping result, returning 18714 1726853441.14087: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-e784-4f7d-000000000068] 18714 1726853441.14089: sending task result for task 02083763-bbaf-e784-4f7d-000000000068 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853441.14510: no more pending results, returning what we have 18714 1726853441.14514: results queue empty 18714 1726853441.14516: checking for any_errors_fatal 18714 1726853441.14522: done checking for any_errors_fatal 18714 1726853441.14523: checking for max_fail_percentage 18714 1726853441.14525: done checking for max_fail_percentage 18714 1726853441.14526: checking to see if all hosts have failed and the running result is not ok 18714 1726853441.14527: done checking to see if all hosts have failed 18714 1726853441.14528: getting the remaining hosts for this loop 18714 1726853441.14529: done getting the remaining hosts for this loop 18714 1726853441.14533: getting the next task for host managed_node1 18714 1726853441.14541: done getting next task for host managed_node1 18714 1726853441.14545: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18714 1726853441.14547: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853441.14556: getting variables 18714 1726853441.14558: in VariableManager get_vars() 18714 1726853441.14633: Calling all_inventory to load vars for managed_node1 18714 1726853441.14636: Calling groups_inventory to load vars for managed_node1 18714 1726853441.14640: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853441.14651: Calling all_plugins_play to load vars for managed_node1 18714 1726853441.14654: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853441.14657: Calling groups_plugins_play to load vars for managed_node1 18714 1726853441.15184: done sending task result for task 02083763-bbaf-e784-4f7d-000000000068 18714 1726853441.15188: WORKER PROCESS EXITING 18714 1726853441.16293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853441.17777: done with get_vars() 18714 1726853441.17802: done getting variables 18714 1726853441.17861: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:30:41 -0400 (0:00:00.738) 0:00:37.562 ****** 18714 1726853441.17893: entering _queue_task() for managed_node1/service 18714 1726853441.18218: worker is 1 (out of 1 available) 18714 1726853441.18230: exiting _queue_task() for managed_node1/service 18714 1726853441.18240: done queuing things up, now waiting for results queue to drain 18714 1726853441.18241: waiting for pending results... 
18714 1726853441.18518: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18714 1726853441.18616: in run() - task 02083763-bbaf-e784-4f7d-000000000069 18714 1726853441.18642: variable 'ansible_search_path' from source: unknown 18714 1726853441.18650: variable 'ansible_search_path' from source: unknown 18714 1726853441.18702: calling self._execute() 18714 1726853441.18816: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853441.18828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853441.18844: variable 'omit' from source: magic vars 18714 1726853441.19248: variable 'ansible_distribution_major_version' from source: facts 18714 1726853441.19267: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853441.19387: variable 'network_provider' from source: set_fact 18714 1726853441.19398: Evaluated conditional (network_provider == "nm"): True 18714 1726853441.19497: variable '__network_wpa_supplicant_required' from source: role '' defaults 18714 1726853441.19592: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18714 1726853441.19764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853441.21866: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853441.21935: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853441.21983: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853441.22022: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853441.22055: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853441.22155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853441.22276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853441.22279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853441.22282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853441.22284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853441.22336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853441.22364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853441.22397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853441.22442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853441.22461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853441.22509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853441.22538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853441.22567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853441.22611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853441.22637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853441.22790: variable 'network_connections' from source: play vars 18714 1726853441.22808: variable 'profile' from source: play vars 18714 1726853441.22951: variable 'profile' from source: play vars 18714 1726853441.22954: variable 'interface' from source: set_fact 18714 1726853441.22966: variable 'interface' from source: set_fact 18714 1726853441.23041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18714 1726853441.23206: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18714 1726853441.23250: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18714 1726853441.23290: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18714 1726853441.23328: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18714 1726853441.23373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18714 1726853441.23404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18714 1726853441.23437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853441.23463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18714 1726853441.23530: variable '__network_wireless_connections_defined' from source: role '' defaults 18714 1726853441.23792: variable 'network_connections' from source: play vars 18714 1726853441.23803: variable 'profile' from source: play vars 18714 1726853441.23927: variable 'profile' from source: play vars 18714 1726853441.23930: variable 'interface' from source: set_fact 18714 1726853441.23946: variable 'interface' from source: set_fact 18714 1726853441.23983: Evaluated conditional (__network_wpa_supplicant_required): False 18714 1726853441.23990: when evaluation is False, skipping this task 18714 1726853441.23996: _execute() done 18714 1726853441.24008: dumping result 
to json 18714 1726853441.24014: done dumping result, returning 18714 1726853441.24023: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-e784-4f7d-000000000069] 18714 1726853441.24035: sending task result for task 02083763-bbaf-e784-4f7d-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18714 1726853441.24263: no more pending results, returning what we have 18714 1726853441.24268: results queue empty 18714 1726853441.24269: checking for any_errors_fatal 18714 1726853441.24289: done checking for any_errors_fatal 18714 1726853441.24291: checking for max_fail_percentage 18714 1726853441.24293: done checking for max_fail_percentage 18714 1726853441.24294: checking to see if all hosts have failed and the running result is not ok 18714 1726853441.24295: done checking to see if all hosts have failed 18714 1726853441.24295: getting the remaining hosts for this loop 18714 1726853441.24297: done getting the remaining hosts for this loop 18714 1726853441.24301: getting the next task for host managed_node1 18714 1726853441.24310: done getting next task for host managed_node1 18714 1726853441.24315: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18714 1726853441.24319: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853441.24334: getting variables 18714 1726853441.24336: in VariableManager get_vars() 18714 1726853441.24594: Calling all_inventory to load vars for managed_node1 18714 1726853441.24598: Calling groups_inventory to load vars for managed_node1 18714 1726853441.24600: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853441.24611: Calling all_plugins_play to load vars for managed_node1 18714 1726853441.24614: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853441.24617: Calling groups_plugins_play to load vars for managed_node1 18714 1726853441.25206: done sending task result for task 02083763-bbaf-e784-4f7d-000000000069 18714 1726853441.25209: WORKER PROCESS EXITING 18714 1726853441.26100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853441.27860: done with get_vars() 18714 1726853441.27885: done getting variables 18714 1726853441.27949: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:30:41 -0400 (0:00:00.100) 0:00:37.663 ****** 18714 1726853441.27982: entering _queue_task() for managed_node1/service 18714 1726853441.28328: worker is 1 (out of 1 available) 18714 1726853441.28340: exiting _queue_task() for managed_node1/service 18714 1726853441.28353: done queuing things up, now waiting for results queue to drain 18714 1726853441.28354: waiting for pending results... 
18714 1726853441.28634: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18714 1726853441.28738: in run() - task 02083763-bbaf-e784-4f7d-00000000006a 18714 1726853441.28760: variable 'ansible_search_path' from source: unknown 18714 1726853441.28768: variable 'ansible_search_path' from source: unknown 18714 1726853441.28816: calling self._execute() 18714 1726853441.28923: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853441.28935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853441.28950: variable 'omit' from source: magic vars 18714 1726853441.29339: variable 'ansible_distribution_major_version' from source: facts 18714 1726853441.29357: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853441.29468: variable 'network_provider' from source: set_fact 18714 1726853441.29482: Evaluated conditional (network_provider == "initscripts"): False 18714 1726853441.29489: when evaluation is False, skipping this task 18714 1726853441.29496: _execute() done 18714 1726853441.29503: dumping result to json 18714 1726853441.29510: done dumping result, returning 18714 1726853441.29522: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-e784-4f7d-00000000006a] 18714 1726853441.29531: sending task result for task 02083763-bbaf-e784-4f7d-00000000006a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18714 1726853441.29716: no more pending results, returning what we have 18714 1726853441.29721: results queue empty 18714 1726853441.29722: checking for any_errors_fatal 18714 1726853441.29733: done checking for any_errors_fatal 18714 1726853441.29734: checking for max_fail_percentage 18714 1726853441.29736: done checking for max_fail_percentage 18714 
1726853441.29736: checking to see if all hosts have failed and the running result is not ok 18714 1726853441.29737: done checking to see if all hosts have failed 18714 1726853441.29738: getting the remaining hosts for this loop 18714 1726853441.29739: done getting the remaining hosts for this loop 18714 1726853441.29744: getting the next task for host managed_node1 18714 1726853441.29753: done getting next task for host managed_node1 18714 1726853441.29757: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18714 1726853441.29760: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853441.29781: getting variables 18714 1726853441.29783: in VariableManager get_vars() 18714 1726853441.29825: Calling all_inventory to load vars for managed_node1 18714 1726853441.29829: Calling groups_inventory to load vars for managed_node1 18714 1726853441.29832: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853441.29845: Calling all_plugins_play to load vars for managed_node1 18714 1726853441.29848: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853441.29851: Calling groups_plugins_play to load vars for managed_node1 18714 1726853441.30587: done sending task result for task 02083763-bbaf-e784-4f7d-00000000006a 18714 1726853441.30590: WORKER PROCESS EXITING 18714 1726853441.31477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853441.33080: done with get_vars() 18714 1726853441.33108: done getting variables 18714 1726853441.33166: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:30:41 -0400 (0:00:00.052) 0:00:37.715 ****** 18714 1726853441.33199: entering _queue_task() for managed_node1/copy 18714 1726853441.33526: worker is 1 (out of 1 available) 18714 1726853441.33650: exiting _queue_task() for managed_node1/copy 18714 1726853441.33660: done queuing things up, now waiting for results queue to drain 18714 1726853441.33661: waiting for pending results... 18714 1726853441.33830: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18714 1726853441.33942: in run() - task 02083763-bbaf-e784-4f7d-00000000006b 18714 1726853441.33965: variable 'ansible_search_path' from source: unknown 18714 1726853441.33981: variable 'ansible_search_path' from source: unknown 18714 1726853441.34022: calling self._execute() 18714 1726853441.34133: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853441.34144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853441.34159: variable 'omit' from source: magic vars 18714 1726853441.34547: variable 'ansible_distribution_major_version' from source: facts 18714 1726853441.34564: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853441.34684: variable 'network_provider' from source: set_fact 18714 1726853441.34694: Evaluated conditional (network_provider == "initscripts"): False 18714 1726853441.34700: when evaluation is False, skipping this task 18714 1726853441.34705: _execute() done 18714 1726853441.34711: dumping result to json 
18714 1726853441.34717: done dumping result, returning 18714 1726853441.34727: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-e784-4f7d-00000000006b] 18714 1726853441.34740: sending task result for task 02083763-bbaf-e784-4f7d-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18714 1726853441.34992: no more pending results, returning what we have 18714 1726853441.34998: results queue empty 18714 1726853441.34999: checking for any_errors_fatal 18714 1726853441.35006: done checking for any_errors_fatal 18714 1726853441.35007: checking for max_fail_percentage 18714 1726853441.35009: done checking for max_fail_percentage 18714 1726853441.35010: checking to see if all hosts have failed and the running result is not ok 18714 1726853441.35011: done checking to see if all hosts have failed 18714 1726853441.35011: getting the remaining hosts for this loop 18714 1726853441.35013: done getting the remaining hosts for this loop 18714 1726853441.35017: getting the next task for host managed_node1 18714 1726853441.35025: done getting next task for host managed_node1 18714 1726853441.35030: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18714 1726853441.35033: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853441.35049: getting variables 18714 1726853441.35051: in VariableManager get_vars() 18714 1726853441.35303: Calling all_inventory to load vars for managed_node1 18714 1726853441.35306: Calling groups_inventory to load vars for managed_node1 18714 1726853441.35308: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853441.35314: done sending task result for task 02083763-bbaf-e784-4f7d-00000000006b 18714 1726853441.35317: WORKER PROCESS EXITING 18714 1726853441.35326: Calling all_plugins_play to load vars for managed_node1 18714 1726853441.35329: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853441.35332: Calling groups_plugins_play to load vars for managed_node1 18714 1726853441.36837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853441.38429: done with get_vars() 18714 1726853441.38451: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:30:41 -0400 (0:00:00.053) 0:00:37.768 ****** 18714 1726853441.38534: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18714 1726853441.38848: worker is 1 (out of 1 available) 18714 1726853441.38860: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18714 1726853441.39076: done queuing things up, now waiting for results queue to drain 18714 1726853441.39077: waiting for pending results... 
18714 1726853441.39156: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18714 1726853441.39249: in run() - task 02083763-bbaf-e784-4f7d-00000000006c 18714 1726853441.39270: variable 'ansible_search_path' from source: unknown 18714 1726853441.39296: variable 'ansible_search_path' from source: unknown 18714 1726853441.39330: calling self._execute() 18714 1726853441.39476: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853441.39479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853441.39482: variable 'omit' from source: magic vars 18714 1726853441.39859: variable 'ansible_distribution_major_version' from source: facts 18714 1726853441.39880: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853441.39892: variable 'omit' from source: magic vars 18714 1726853441.39935: variable 'omit' from source: magic vars 18714 1726853441.40100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18714 1726853441.42332: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18714 1726853441.42377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18714 1726853441.42414: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18714 1726853441.42457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18714 1726853441.42549: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18714 1726853441.42574: variable 'network_provider' from source: set_fact 18714 1726853441.42702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18714 1726853441.42744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18714 1726853441.42780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18714 1726853441.42822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18714 1726853441.42839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18714 1726853441.42913: variable 'omit' from source: magic vars 18714 1726853441.43029: variable 'omit' from source: magic vars 18714 1726853441.43143: variable 'network_connections' from source: play vars 18714 1726853441.43161: variable 'profile' from source: play vars 18714 1726853441.43306: variable 'profile' from source: play vars 18714 1726853441.43310: variable 'interface' from source: set_fact 18714 1726853441.43313: variable 'interface' from source: set_fact 18714 1726853441.43462: variable 'omit' from source: magic vars 18714 1726853441.43478: variable '__lsr_ansible_managed' from source: task vars 18714 1726853441.43547: variable '__lsr_ansible_managed' from source: task vars 18714 1726853441.43830: Loaded config def from plugin (lookup/template) 18714 1726853441.43841: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18714 1726853441.43881: File lookup term: get_ansible_managed.j2 18714 
1726853441.43889: variable 'ansible_search_path' from source: unknown 18714 1726853441.43899: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18714 1726853441.43915: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18714 1726853441.43936: variable 'ansible_search_path' from source: unknown 18714 1726853441.55478: variable 'ansible_managed' from source: unknown 18714 1726853441.55483: variable 'omit' from source: magic vars 18714 1726853441.55485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853441.55499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853441.55519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853441.55540: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853441.55552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853441.55580: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853441.55589: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853441.55604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853441.55696: Set connection var ansible_shell_executable to /bin/sh 18714 1726853441.55711: Set connection var ansible_timeout to 10 18714 1726853441.55720: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853441.55728: Set connection var ansible_connection to ssh 18714 1726853441.55734: Set connection var ansible_shell_type to sh 18714 1726853441.55741: Set connection var ansible_pipelining to False 18714 1726853441.55762: variable 'ansible_shell_executable' from source: unknown 18714 1726853441.55768: variable 'ansible_connection' from source: unknown 18714 1726853441.55775: variable 'ansible_module_compression' from source: unknown 18714 1726853441.55780: variable 'ansible_shell_type' from source: unknown 18714 1726853441.55785: variable 'ansible_shell_executable' from source: unknown 18714 1726853441.55790: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853441.55795: variable 'ansible_pipelining' from source: unknown 18714 1726853441.55800: variable 'ansible_timeout' from source: unknown 18714 1726853441.55806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853441.55929: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853441.55976: variable 'omit' from source: magic vars 18714 1726853441.55979: starting attempt loop 18714 1726853441.55980: running the handler 18714 1726853441.55982: _low_level_execute_command(): starting 18714 1726853441.55984: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853441.56631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853441.56645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853441.56662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853441.56700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853441.56805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853441.56841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853441.56921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853441.58591: stdout 
chunk (state=3): >>>/root <<< 18714 1726853441.58687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853441.58877: stderr chunk (state=3): >>><<< 18714 1726853441.58881: stdout chunk (state=3): >>><<< 18714 1726853441.58884: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853441.58887: _low_level_execute_command(): starting 18714 1726853441.58889: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468 `" && echo ansible-tmp-1726853441.5874453-20464-58602001610468="` echo /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468 `" ) && sleep 0' 18714 1726853441.59307: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853441.59317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853441.59335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853441.59341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853441.59356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853441.59360: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853441.59372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853441.59387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853441.59444: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853441.59447: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853441.59449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853441.59454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853441.59456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853441.59458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853441.59460: stderr chunk (state=3): >>>debug2: match found <<< 18714 1726853441.59462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853441.59520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853441.59532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853441.59541: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853441.59632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853441.61512: stdout chunk (state=3): >>>ansible-tmp-1726853441.5874453-20464-58602001610468=/root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468 <<< 18714 1726853441.61667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853441.61673: stdout chunk (state=3): >>><<< 18714 1726853441.61676: stderr chunk (state=3): >>><<< 18714 1726853441.61697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853441.5874453-20464-58602001610468=/root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853441.61880: variable 
'ansible_module_compression' from source: unknown 18714 1726853441.61884: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18714 1726853441.61892: variable 'ansible_facts' from source: unknown 18714 1726853441.61989: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/AnsiballZ_network_connections.py 18714 1726853441.62221: Sending initial data 18714 1726853441.62224: Sent initial data (167 bytes) 18714 1726853441.62813: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853441.62824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853441.62872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853441.62940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853441.62955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853441.62981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 
1726853441.63049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853441.64592: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853441.64658: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853441.64714: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpikyw0q2b /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/AnsiballZ_network_connections.py <<< 18714 1726853441.64717: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/AnsiballZ_network_connections.py" <<< 18714 1726853441.64756: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpikyw0q2b" to remote "/root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/AnsiballZ_network_connections.py" <<< 18714 1726853441.65968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853441.65973: stdout chunk (state=3): >>><<< 18714 1726853441.65976: stderr chunk (state=3): >>><<< 18714 1726853441.65978: done transferring module to remote 18714 1726853441.65990: _low_level_execute_command(): starting 18714 1726853441.66000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/ /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/AnsiballZ_network_connections.py && sleep 0' 18714 1726853441.66658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853441.66686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853441.66759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853441.66805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853441.66820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853441.66855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853441.66925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853441.68707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853441.68711: stderr chunk (state=3): >>><<< 18714 1726853441.68713: stdout chunk (state=3): >>><<< 18714 1726853441.68728: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853441.68736: _low_level_execute_command(): starting 18714 1726853441.68786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/AnsiballZ_network_connections.py && sleep 0' 18714 1726853441.69377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853441.69389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853441.69403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853441.69418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853441.69443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853441.69466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853441.69558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853441.69633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853441.69697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853441.96354: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rub556tb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rub556tb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/bd03661e-f09e-4a4d-b5cf-80038b20f631: error=unknown <<< 18714 1726853441.96507: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18714 1726853441.98327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853441.98345: stderr chunk (state=3): >>><<< 18714 1726853441.98352: stdout chunk (state=3): >>><<< 18714 1726853441.98376: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rub556tb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rub556tb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/bd03661e-f09e-4a4d-b5cf-80038b20f631: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853441.98489: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853441.98492: _low_level_execute_command(): starting 18714 1726853441.98494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853441.5874453-20464-58602001610468/ > /dev/null 2>&1 && sleep 0' 18714 1726853441.99035: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853441.99051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853441.99067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853441.99127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853441.99189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853441.99206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853441.99247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853441.99310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853442.01181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853442.01206: stderr chunk (state=3): >>><<< 18714 1726853442.01276: stdout chunk (state=3): >>><<< 18714 1726853442.01280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853442.01283: handler run complete 18714 1726853442.01285: attempt loop complete, returning result 18714 1726853442.01287: _execute() done 18714 1726853442.01289: dumping result to json 18714 1726853442.01291: done dumping result, returning 18714 1726853442.01313: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-e784-4f7d-00000000006c] 18714 1726853442.01321: sending task result for task 02083763-bbaf-e784-4f7d-00000000006c changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18714 1726853442.01566: no more pending results, returning what we have 18714 1726853442.01569: results queue empty 18714 1726853442.01570: 
checking for any_errors_fatal 18714 1726853442.01579: done checking for any_errors_fatal 18714 1726853442.01580: checking for max_fail_percentage 18714 1726853442.01583: done checking for max_fail_percentage 18714 1726853442.01584: checking to see if all hosts have failed and the running result is not ok 18714 1726853442.01585: done checking to see if all hosts have failed 18714 1726853442.01585: getting the remaining hosts for this loop 18714 1726853442.01587: done getting the remaining hosts for this loop 18714 1726853442.01591: getting the next task for host managed_node1 18714 1726853442.01597: done getting next task for host managed_node1 18714 1726853442.01601: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18714 1726853442.01603: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853442.01613: getting variables 18714 1726853442.01615: in VariableManager get_vars() 18714 1726853442.01650: Calling all_inventory to load vars for managed_node1 18714 1726853442.01653: Calling groups_inventory to load vars for managed_node1 18714 1726853442.01655: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853442.01665: Calling all_plugins_play to load vars for managed_node1 18714 1726853442.01668: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853442.01883: Calling groups_plugins_play to load vars for managed_node1 18714 1726853442.01896: done sending task result for task 02083763-bbaf-e784-4f7d-00000000006c 18714 1726853442.01900: WORKER PROCESS EXITING 18714 1726853442.03463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853442.05997: done with get_vars() 18714 1726853442.06021: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:30:42 -0400 (0:00:00.675) 0:00:38.444 ****** 18714 1726853442.06118: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18714 1726853442.06893: worker is 1 (out of 1 available) 18714 1726853442.06905: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18714 1726853442.06915: done queuing things up, now waiting for results queue to drain 18714 1726853442.06917: waiting for pending results... 
18714 1726853442.07425: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18714 1726853442.07592: in run() - task 02083763-bbaf-e784-4f7d-00000000006d 18714 1726853442.07686: variable 'ansible_search_path' from source: unknown 18714 1726853442.07793: variable 'ansible_search_path' from source: unknown 18714 1726853442.07835: calling self._execute() 18714 1726853442.07948: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.08077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.08113: variable 'omit' from source: magic vars 18714 1726853442.08860: variable 'ansible_distribution_major_version' from source: facts 18714 1726853442.08986: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853442.09176: variable 'network_state' from source: role '' defaults 18714 1726853442.09191: Evaluated conditional (network_state != {}): False 18714 1726853442.09311: when evaluation is False, skipping this task 18714 1726853442.09314: _execute() done 18714 1726853442.09316: dumping result to json 18714 1726853442.09318: done dumping result, returning 18714 1726853442.09320: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-e784-4f7d-00000000006d] 18714 1726853442.09322: sending task result for task 02083763-bbaf-e784-4f7d-00000000006d 18714 1726853442.09555: done sending task result for task 02083763-bbaf-e784-4f7d-00000000006d 18714 1726853442.09559: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18714 1726853442.09618: no more pending results, returning what we have 18714 1726853442.09623: results queue empty 18714 1726853442.09624: checking for any_errors_fatal 18714 1726853442.09634: done checking for any_errors_fatal 
18714 1726853442.09635: checking for max_fail_percentage 18714 1726853442.09638: done checking for max_fail_percentage 18714 1726853442.09638: checking to see if all hosts have failed and the running result is not ok 18714 1726853442.09639: done checking to see if all hosts have failed 18714 1726853442.09640: getting the remaining hosts for this loop 18714 1726853442.09641: done getting the remaining hosts for this loop 18714 1726853442.09644: getting the next task for host managed_node1 18714 1726853442.09655: done getting next task for host managed_node1 18714 1726853442.09658: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18714 1726853442.09661: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853442.09878: getting variables 18714 1726853442.09880: in VariableManager get_vars() 18714 1726853442.09916: Calling all_inventory to load vars for managed_node1 18714 1726853442.09919: Calling groups_inventory to load vars for managed_node1 18714 1726853442.09921: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853442.09931: Calling all_plugins_play to load vars for managed_node1 18714 1726853442.09935: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853442.09938: Calling groups_plugins_play to load vars for managed_node1 18714 1726853442.16087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853442.17590: done with get_vars() 18714 1726853442.17612: done getting variables 18714 1726853442.17657: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:30:42 -0400 (0:00:00.115) 0:00:38.560 ****** 18714 1726853442.17682: entering _queue_task() for managed_node1/debug 18714 1726853442.18038: worker is 1 (out of 1 available) 18714 1726853442.18054: exiting _queue_task() for managed_node1/debug 18714 1726853442.18065: done queuing things up, now waiting for results queue to drain 18714 1726853442.18066: waiting for pending results... 
18714 1726853442.18344: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18714 1726853442.18487: in run() - task 02083763-bbaf-e784-4f7d-00000000006e 18714 1726853442.18511: variable 'ansible_search_path' from source: unknown 18714 1726853442.18514: variable 'ansible_search_path' from source: unknown 18714 1726853442.18595: calling self._execute() 18714 1726853442.18664: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.18678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.18699: variable 'omit' from source: magic vars 18714 1726853442.19112: variable 'ansible_distribution_major_version' from source: facts 18714 1726853442.19129: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853442.19145: variable 'omit' from source: magic vars 18714 1726853442.19249: variable 'omit' from source: magic vars 18714 1726853442.19252: variable 'omit' from source: magic vars 18714 1726853442.19286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853442.19324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853442.19354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853442.19382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853442.19403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853442.19439: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853442.19449: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.19576: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18714 1726853442.19580: Set connection var ansible_shell_executable to /bin/sh 18714 1726853442.19586: Set connection var ansible_timeout to 10 18714 1726853442.19597: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853442.19610: Set connection var ansible_connection to ssh 18714 1726853442.19620: Set connection var ansible_shell_type to sh 18714 1726853442.19630: Set connection var ansible_pipelining to False 18714 1726853442.19655: variable 'ansible_shell_executable' from source: unknown 18714 1726853442.19664: variable 'ansible_connection' from source: unknown 18714 1726853442.19675: variable 'ansible_module_compression' from source: unknown 18714 1726853442.19692: variable 'ansible_shell_type' from source: unknown 18714 1726853442.19701: variable 'ansible_shell_executable' from source: unknown 18714 1726853442.19708: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.19716: variable 'ansible_pipelining' from source: unknown 18714 1726853442.19724: variable 'ansible_timeout' from source: unknown 18714 1726853442.19733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.19883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853442.19911: variable 'omit' from source: magic vars 18714 1726853442.20011: starting attempt loop 18714 1726853442.20016: running the handler 18714 1726853442.20061: variable '__network_connections_result' from source: set_fact 18714 1726853442.20120: handler run complete 18714 1726853442.20144: attempt loop complete, returning result 18714 1726853442.20150: _execute() done 18714 1726853442.20156: dumping result to json 18714 1726853442.20162: 
done dumping result, returning 18714 1726853442.20175: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-e784-4f7d-00000000006e] 18714 1726853442.20183: sending task result for task 02083763-bbaf-e784-4f7d-00000000006e ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 18714 1726853442.20348: no more pending results, returning what we have 18714 1726853442.20353: results queue empty 18714 1726853442.20354: checking for any_errors_fatal 18714 1726853442.20361: done checking for any_errors_fatal 18714 1726853442.20362: checking for max_fail_percentage 18714 1726853442.20364: done checking for max_fail_percentage 18714 1726853442.20364: checking to see if all hosts have failed and the running result is not ok 18714 1726853442.20365: done checking to see if all hosts have failed 18714 1726853442.20365: getting the remaining hosts for this loop 18714 1726853442.20366: done getting the remaining hosts for this loop 18714 1726853442.20370: getting the next task for host managed_node1 18714 1726853442.20381: done getting next task for host managed_node1 18714 1726853442.20385: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18714 1726853442.20387: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853442.20400: getting variables 18714 1726853442.20402: in VariableManager get_vars() 18714 1726853442.20435: Calling all_inventory to load vars for managed_node1 18714 1726853442.20438: Calling groups_inventory to load vars for managed_node1 18714 1726853442.20440: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853442.20445: done sending task result for task 02083763-bbaf-e784-4f7d-00000000006e 18714 1726853442.20452: WORKER PROCESS EXITING 18714 1726853442.20580: Calling all_plugins_play to load vars for managed_node1 18714 1726853442.20583: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853442.20586: Calling groups_plugins_play to load vars for managed_node1 18714 1726853442.21947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853442.23605: done with get_vars() 18714 1726853442.23628: done getting variables 18714 1726853442.23693: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:30:42 -0400 (0:00:00.060) 0:00:38.620 ****** 18714 1726853442.23726: entering _queue_task() for managed_node1/debug 18714 1726853442.24058: worker is 1 (out of 1 available) 18714 1726853442.24174: exiting _queue_task() for managed_node1/debug 18714 1726853442.24185: done queuing things up, now waiting for results queue to drain 18714 1726853442.24186: waiting for pending results... 
18714 1726853442.24386: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18714 1726853442.24482: in run() - task 02083763-bbaf-e784-4f7d-00000000006f 18714 1726853442.24498: variable 'ansible_search_path' from source: unknown 18714 1726853442.24501: variable 'ansible_search_path' from source: unknown 18714 1726853442.24647: calling self._execute() 18714 1726853442.24651: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.24653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.24666: variable 'omit' from source: magic vars 18714 1726853442.25069: variable 'ansible_distribution_major_version' from source: facts 18714 1726853442.25083: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853442.25092: variable 'omit' from source: magic vars 18714 1726853442.25123: variable 'omit' from source: magic vars 18714 1726853442.25159: variable 'omit' from source: magic vars 18714 1726853442.25202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853442.25235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853442.25258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853442.25280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853442.25294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853442.25322: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853442.25326: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.25329: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18714 1726853442.25439: Set connection var ansible_shell_executable to /bin/sh 18714 1726853442.25445: Set connection var ansible_timeout to 10 18714 1726853442.25450: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853442.25462: Set connection var ansible_connection to ssh 18714 1726853442.25467: Set connection var ansible_shell_type to sh 18714 1726853442.25478: Set connection var ansible_pipelining to False 18714 1726853442.25499: variable 'ansible_shell_executable' from source: unknown 18714 1726853442.25502: variable 'ansible_connection' from source: unknown 18714 1726853442.25508: variable 'ansible_module_compression' from source: unknown 18714 1726853442.25511: variable 'ansible_shell_type' from source: unknown 18714 1726853442.25514: variable 'ansible_shell_executable' from source: unknown 18714 1726853442.25516: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.25519: variable 'ansible_pipelining' from source: unknown 18714 1726853442.25522: variable 'ansible_timeout' from source: unknown 18714 1726853442.25525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.25678: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853442.25682: variable 'omit' from source: magic vars 18714 1726853442.25685: starting attempt loop 18714 1726853442.25688: running the handler 18714 1726853442.25776: variable '__network_connections_result' from source: set_fact 18714 1726853442.25816: variable '__network_connections_result' from source: set_fact 18714 1726853442.25919: handler run complete 18714 1726853442.25945: attempt loop complete, returning result 18714 1726853442.25948: 
_execute() done 18714 1726853442.25950: dumping result to json 18714 1726853442.25953: done dumping result, returning 18714 1726853442.25963: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-e784-4f7d-00000000006f] 18714 1726853442.25965: sending task result for task 02083763-bbaf-e784-4f7d-00000000006f ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 18714 1726853442.26132: done sending task result for task 02083763-bbaf-e784-4f7d-00000000006f 18714 1726853442.26142: no more pending results, returning what we have 18714 1726853442.26145: results queue empty 18714 1726853442.26146: checking for any_errors_fatal 18714 1726853442.26151: done checking for any_errors_fatal 18714 1726853442.26154: checking for max_fail_percentage 18714 1726853442.26156: done checking for max_fail_percentage 18714 1726853442.26156: checking to see if all hosts have failed and the running result is not ok 18714 1726853442.26157: done checking to see if all hosts have failed 18714 1726853442.26158: getting the remaining hosts for this loop 18714 1726853442.26160: done getting the remaining hosts for this loop 18714 1726853442.26164: getting the next task for host managed_node1 18714 1726853442.26173: done getting next task for host managed_node1 18714 1726853442.26177: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18714 1726853442.26178: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853442.26189: WORKER PROCESS EXITING 18714 1726853442.26377: getting variables 18714 1726853442.26379: in VariableManager get_vars() 18714 1726853442.26408: Calling all_inventory to load vars for managed_node1 18714 1726853442.26410: Calling groups_inventory to load vars for managed_node1 18714 1726853442.26412: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853442.26420: Calling all_plugins_play to load vars for managed_node1 18714 1726853442.26423: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853442.26426: Calling groups_plugins_play to load vars for managed_node1 18714 1726853442.27856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853442.29437: done with get_vars() 18714 1726853442.29464: done getting variables 18714 1726853442.29524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:30:42 -0400 (0:00:00.058) 0:00:38.678 ****** 18714 1726853442.29562: entering _queue_task() for managed_node1/debug 18714 1726853442.29859: worker is 1 (out of 1 available) 18714 1726853442.29875: exiting _queue_task() for managed_node1/debug 18714 1726853442.29887: done queuing things up, now waiting for results queue to drain 18714 1726853442.29888: waiting for pending results... 
18714 1726853442.30177: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18714 1726853442.30375: in run() - task 02083763-bbaf-e784-4f7d-000000000070 18714 1726853442.30379: variable 'ansible_search_path' from source: unknown 18714 1726853442.30386: variable 'ansible_search_path' from source: unknown 18714 1726853442.30389: calling self._execute() 18714 1726853442.30435: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.30439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.30450: variable 'omit' from source: magic vars 18714 1726853442.30853: variable 'ansible_distribution_major_version' from source: facts 18714 1726853442.30868: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853442.30991: variable 'network_state' from source: role '' defaults 18714 1726853442.31001: Evaluated conditional (network_state != {}): False 18714 1726853442.31005: when evaluation is False, skipping this task 18714 1726853442.31007: _execute() done 18714 1726853442.31010: dumping result to json 18714 1726853442.31013: done dumping result, returning 18714 1726853442.31023: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-e784-4f7d-000000000070] 18714 1726853442.31026: sending task result for task 02083763-bbaf-e784-4f7d-000000000070 18714 1726853442.31120: done sending task result for task 02083763-bbaf-e784-4f7d-000000000070 18714 1726853442.31176: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 18714 1726853442.31226: no more pending results, returning what we have 18714 1726853442.31231: results queue empty 18714 1726853442.31232: checking for any_errors_fatal 18714 1726853442.31241: done checking for any_errors_fatal 18714 1726853442.31242: checking for 
max_fail_percentage 18714 1726853442.31244: done checking for max_fail_percentage 18714 1726853442.31245: checking to see if all hosts have failed and the running result is not ok 18714 1726853442.31245: done checking to see if all hosts have failed 18714 1726853442.31246: getting the remaining hosts for this loop 18714 1726853442.31247: done getting the remaining hosts for this loop 18714 1726853442.31254: getting the next task for host managed_node1 18714 1726853442.31262: done getting next task for host managed_node1 18714 1726853442.31267: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18714 1726853442.31269: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853442.31286: getting variables 18714 1726853442.31288: in VariableManager get_vars() 18714 1726853442.31324: Calling all_inventory to load vars for managed_node1 18714 1726853442.31327: Calling groups_inventory to load vars for managed_node1 18714 1726853442.31329: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853442.31341: Calling all_plugins_play to load vars for managed_node1 18714 1726853442.31343: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853442.31346: Calling groups_plugins_play to load vars for managed_node1 18714 1726853442.32728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853442.34499: done with get_vars() 18714 1726853442.34523: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:30:42 -0400 
(0:00:00.050) 0:00:38.729 ****** 18714 1726853442.34626: entering _queue_task() for managed_node1/ping 18714 1726853442.34962: worker is 1 (out of 1 available) 18714 1726853442.34978: exiting _queue_task() for managed_node1/ping 18714 1726853442.34991: done queuing things up, now waiting for results queue to drain 18714 1726853442.34992: waiting for pending results... 18714 1726853442.35291: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18714 1726853442.35391: in run() - task 02083763-bbaf-e784-4f7d-000000000071 18714 1726853442.35432: variable 'ansible_search_path' from source: unknown 18714 1726853442.35440: variable 'ansible_search_path' from source: unknown 18714 1726853442.35443: calling self._execute() 18714 1726853442.35555: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.35577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.35581: variable 'omit' from source: magic vars 18714 1726853442.36087: variable 'ansible_distribution_major_version' from source: facts 18714 1726853442.36091: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853442.36095: variable 'omit' from source: magic vars 18714 1726853442.36098: variable 'omit' from source: magic vars 18714 1726853442.36100: variable 'omit' from source: magic vars 18714 1726853442.36112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853442.36153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853442.36178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853442.36195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853442.36208: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853442.36239: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853442.36242: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.36251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.36353: Set connection var ansible_shell_executable to /bin/sh 18714 1726853442.36412: Set connection var ansible_timeout to 10 18714 1726853442.36415: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853442.36418: Set connection var ansible_connection to ssh 18714 1726853442.36420: Set connection var ansible_shell_type to sh 18714 1726853442.36423: Set connection var ansible_pipelining to False 18714 1726853442.36425: variable 'ansible_shell_executable' from source: unknown 18714 1726853442.36432: variable 'ansible_connection' from source: unknown 18714 1726853442.36435: variable 'ansible_module_compression' from source: unknown 18714 1726853442.36437: variable 'ansible_shell_type' from source: unknown 18714 1726853442.36439: variable 'ansible_shell_executable' from source: unknown 18714 1726853442.36442: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.36444: variable 'ansible_pipelining' from source: unknown 18714 1726853442.36446: variable 'ansible_timeout' from source: unknown 18714 1726853442.36448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.36739: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853442.36744: variable 'omit' from source: magic vars 18714 1726853442.36747: starting attempt loop 18714 1726853442.36749: running 
the handler 18714 1726853442.36751: _low_level_execute_command(): starting 18714 1726853442.36754: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853442.37411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853442.37420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853442.37452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853442.37484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853442.37541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853442.37606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853442.37610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853442.37656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853442.39339: stdout chunk (state=3): >>>/root <<< 18714 1726853442.39498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853442.39502: stdout chunk (state=3): >>><<< 18714 1726853442.39504: stderr chunk 
(state=3): >>><<< 18714 1726853442.39627: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853442.39631: _low_level_execute_command(): starting 18714 1726853442.39635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058 `" && echo ansible-tmp-1726853442.3953326-20505-175687938519058="` echo /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058 `" ) && sleep 0' 18714 1726853442.40151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853442.40160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853442.40172: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 18714 1726853442.40187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853442.40200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853442.40211: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853442.40214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853442.40285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853442.40288: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853442.40293: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853442.40295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853442.40301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853442.40304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853442.40306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853442.40308: stderr chunk (state=3): >>>debug2: match found <<< 18714 1726853442.40309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853442.40408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853442.40412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853442.40462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853442.42357: stdout chunk (state=3): 
>>>ansible-tmp-1726853442.3953326-20505-175687938519058=/root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058 <<< 18714 1726853442.42518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853442.42520: stdout chunk (state=3): >>><<< 18714 1726853442.42522: stderr chunk (state=3): >>><<< 18714 1726853442.42537: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853442.3953326-20505-175687938519058=/root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853442.42588: variable 'ansible_module_compression' from source: unknown 18714 1726853442.42642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18714 1726853442.42793: variable 
'ansible_facts' from source: unknown 18714 1726853442.42882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/AnsiballZ_ping.py 18714 1726853442.43035: Sending initial data 18714 1726853442.43044: Sent initial data (153 bytes) 18714 1726853442.43714: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853442.43834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853442.43856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853442.43877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853442.43959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853442.45535: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853442.45600: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853442.45645: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp9feqsb29 /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/AnsiballZ_ping.py <<< 18714 1726853442.45661: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/AnsiballZ_ping.py" <<< 18714 1726853442.45704: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp9feqsb29" to remote "/root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/AnsiballZ_ping.py" <<< 18714 1726853442.46576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853442.46581: stderr chunk (state=3): >>><<< 18714 1726853442.46584: stdout chunk (state=3): >>><<< 18714 1726853442.46587: done transferring module to remote 18714 1726853442.46590: _low_level_execute_command(): starting 18714 1726853442.46593: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/ /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/AnsiballZ_ping.py && 
sleep 0' 18714 1726853442.47323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853442.47336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18714 1726853442.47359: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853442.47461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853442.47494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853442.49477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853442.49481: stdout chunk (state=3): >>><<< 18714 1726853442.49484: stderr chunk (state=3): >>><<< 18714 1726853442.49487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853442.49489: _low_level_execute_command(): starting 18714 1726853442.49492: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/AnsiballZ_ping.py && sleep 0' 18714 1726853442.49965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853442.49974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853442.49986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853442.49999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853442.50010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853442.50016: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853442.50033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853442.50046: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 18714 1726853442.50056: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853442.50058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853442.50087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853442.50156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853442.50175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853442.50249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853442.65275: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18714 1726853442.66777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853442.66783: stdout chunk (state=3): >>><<< 18714 1726853442.66786: stderr chunk (state=3): >>><<< 18714 1726853442.66790: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853442.66793: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853442.66796: _low_level_execute_command(): starting 18714 1726853442.66799: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853442.3953326-20505-175687938519058/ > /dev/null 2>&1 && sleep 0' 18714 1726853442.67363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853442.67382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853442.67399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853442.67418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853442.67438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853442.67451: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853442.67552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853442.67574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853442.67644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853442.69519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853442.69531: stdout chunk (state=3): >>><<< 18714 1726853442.69595: stderr chunk (state=3): >>><<< 18714 1726853442.69618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2
debug2: Received exit status from master 0
18714 1726853442.69629: handler run complete
18714 1726853442.69648: attempt loop complete, returning result
18714 1726853442.69877: _execute() done
18714 1726853442.69880: dumping result to json
18714 1726853442.69883: done dumping result, returning
18714 1726853442.69885: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-e784-4f7d-000000000071]
18714 1726853442.69887: sending task result for task 02083763-bbaf-e784-4f7d-000000000071
18714 1726853442.69961: done sending task result for task 02083763-bbaf-e784-4f7d-000000000071
18714 1726853442.69965: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "ping": "pong"
}
18714 1726853442.70024: no more pending results, returning what we have
18714 1726853442.70026: results queue empty
18714 1726853442.70027: checking for any_errors_fatal
18714 1726853442.70034: done checking for any_errors_fatal
18714 1726853442.70035: checking for max_fail_percentage
18714 1726853442.70037: done checking for max_fail_percentage
18714 1726853442.70037: checking to see if all hosts have failed and the running result is not ok
18714 1726853442.70038: done checking to see if all hosts have failed
18714 1726853442.70038: getting the remaining hosts for this loop
18714 1726853442.70039: done getting the remaining hosts for this loop
18714 1726853442.70045: getting the next task for host managed_node1
18714 1726853442.70055: done getting next task for host managed_node1
18714 1726853442.70057: ^ task is: TASK: meta (role_complete)
18714 1726853442.70058: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853442.70068: getting variables
18714 1726853442.70069: in VariableManager get_vars()
18714 1726853442.70108: Calling all_inventory to load vars for managed_node1
18714 1726853442.70111: Calling groups_inventory to load vars for managed_node1
18714 1726853442.70113: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853442.70123: Calling all_plugins_play to load vars for managed_node1
18714 1726853442.70125: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853442.70127: Calling groups_plugins_play to load vars for managed_node1
18714 1726853442.72137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853442.74706: done with get_vars()
18714 1726853442.74840: done getting variables
18714 1726853442.74921: done queuing things up, now waiting for results queue to drain
18714 1726853442.74923: results queue empty
18714 1726853442.74924: checking for any_errors_fatal
18714 1726853442.74926: done checking for any_errors_fatal
18714 1726853442.74927: checking for max_fail_percentage
18714 1726853442.74928: done checking for max_fail_percentage
18714 1726853442.74929: checking to see if all hosts have failed and the running result is not ok
18714 1726853442.74929: done checking to see if all hosts have failed
18714 1726853442.74930: getting the remaining hosts for this loop
18714 1726853442.74931: done getting the remaining hosts for this loop
18714 1726853442.74933: getting the next task for host managed_node1
18714 1726853442.74937: done getting next task for host managed_node1
18714 1726853442.74938: ^ task is: TASK: meta (flush_handlers)
18714 1726853442.74940: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853442.74942: getting variables
18714 1726853442.74943: in VariableManager get_vars()
18714 1726853442.74956: Calling all_inventory to load vars for managed_node1
18714 1726853442.74958: Calling groups_inventory to load vars for managed_node1
18714 1726853442.74960: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853442.74964: Calling all_plugins_play to load vars for managed_node1
18714 1726853442.74966: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853442.74969: Calling groups_plugins_play to load vars for managed_node1
18714 1726853442.78381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853442.81594: done with get_vars()
18714 1726853442.81618: done getting variables
18714 1726853442.81677: in VariableManager get_vars()
18714 1726853442.81689: Calling all_inventory to load vars for managed_node1
18714 1726853442.81691: Calling groups_inventory to load vars for managed_node1
18714 1726853442.81693: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853442.81697: Calling all_plugins_play to load vars for managed_node1
18714 1726853442.81699: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853442.81701: Calling groups_plugins_play to load vars for managed_node1
18714 1726853442.84068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853442.86289: done with get_vars()
18714 1726853442.86326: done queuing things up, now waiting for results queue to drain
18714 1726853442.86328: results queue empty
18714 1726853442.86329: checking for any_errors_fatal
18714 1726853442.86330: done checking for any_errors_fatal
18714 1726853442.86331: checking for max_fail_percentage
18714 1726853442.86339: done checking for max_fail_percentage
18714 1726853442.86340: checking to see if all hosts have failed and the running result is not ok
18714 1726853442.86341: done checking to see if all hosts have failed
18714 1726853442.86342: getting the remaining hosts for this loop
18714 1726853442.86343: done getting the remaining hosts for this loop
18714 1726853442.86346: getting the next task for host managed_node1
18714 1726853442.86350: done getting next task for host managed_node1
18714 1726853442.86354: ^ task is: TASK: meta (flush_handlers)
18714 1726853442.86356: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853442.86363: getting variables
18714 1726853442.86364: in VariableManager get_vars()
18714 1726853442.86379: Calling all_inventory to load vars for managed_node1
18714 1726853442.86381: Calling groups_inventory to load vars for managed_node1
18714 1726853442.86383: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853442.86397: Calling all_plugins_play to load vars for managed_node1
18714 1726853442.86400: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853442.86408: Calling groups_plugins_play to load vars for managed_node1
18714 1726853442.87750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853442.90118: done with get_vars()
18714 1726853442.90139: done getting variables
18714 1726853442.90315: in VariableManager get_vars()
18714 1726853442.90328: Calling all_inventory to load vars for managed_node1
18714 1726853442.90331: Calling groups_inventory to load vars for managed_node1
18714 1726853442.90333: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853442.90338: Calling all_plugins_play to load vars for managed_node1
18714 1726853442.90340: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853442.90344: Calling groups_plugins_play to load vars for managed_node1
18714 1726853442.92000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853442.94013: done with get_vars()
18714 1726853442.94037: done queuing things up, now waiting for results queue to drain
18714 1726853442.94039: results queue empty
18714 1726853442.94040: checking for any_errors_fatal
18714 1726853442.94041: done checking for any_errors_fatal
18714 1726853442.94042: checking for max_fail_percentage
18714 1726853442.94043: done checking for max_fail_percentage
18714 1726853442.94043: checking to see if all hosts have failed and the running result is not ok
18714 1726853442.94044: done checking to see if all hosts have failed
18714 1726853442.94045: getting the remaining hosts for this loop
18714 1726853442.94046: done getting the remaining hosts for this loop
18714 1726853442.94048: getting the next task for host managed_node1
18714 1726853442.94054: done getting next task for host managed_node1
18714 1726853442.94054: ^ task is: None
18714 1726853442.94056: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853442.94057: done queuing things up, now waiting for results queue to drain
18714 1726853442.94058: results queue empty
18714 1726853442.94059: checking for any_errors_fatal
18714 1726853442.94059: done checking for any_errors_fatal
18714 1726853442.94060: checking for max_fail_percentage
18714 1726853442.94061: done checking for max_fail_percentage
18714 1726853442.94062: checking to see if all hosts have failed and the running result is not ok
18714 1726853442.94062: done checking to see if all hosts have failed
18714 1726853442.94063: getting the next task for host managed_node1
18714 1726853442.94065: done getting next task for host managed_node1
18714 1726853442.94066: ^ task is: None
18714 1726853442.94067: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853442.94195: in VariableManager get_vars()
18714 1726853442.94324: done with get_vars()
18714 1726853442.94330: in VariableManager get_vars()
18714 1726853442.94339: done with get_vars()
18714 1726853442.94343: variable 'omit' from source: magic vars
18714 1726853442.94380: in VariableManager get_vars()
18714 1726853442.94391: done with get_vars()
18714 1726853442.94448: variable 'omit' from source: magic vars

PLAY [Assert device and profile are absent] ************************************
18714 1726853442.94868: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
18714 1726853442.94892: getting the remaining hosts for this loop
18714 1726853442.94893: done getting the remaining hosts for this loop
18714 1726853442.94896: getting the next task for host managed_node1
18714 1726853442.94898: done getting next task for host managed_node1
18714 1726853442.94901: ^ task is: TASK: Gathering Facts
18714 1726853442.94902: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853442.94904: getting variables
18714 1726853442.94905: in VariableManager get_vars()
18714 1726853442.94914: Calling all_inventory to load vars for managed_node1
18714 1726853442.94917: Calling groups_inventory to load vars for managed_node1
18714 1726853442.94919: Calling all_plugins_inventory to load vars for managed_node1
18714 1726853442.94924: Calling all_plugins_play to load vars for managed_node1
18714 1726853442.94927: Calling groups_plugins_inventory to load vars for managed_node1
18714 1726853442.94930: Calling groups_plugins_play to load vars for managed_node1
18714 1726853442.96283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853442.98003: done with get_vars()
18714 1726853442.98024: done getting variables
18714 1726853442.98124: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
Friday 20 September 2024  13:30:42 -0400 (0:00:00.635)       0:00:39.364 ******
18714 1726853442.98150: entering _queue_task() for managed_node1/gather_facts
18714 1726853442.98846: worker is 1 (out of 1 available)
18714 1726853442.98862: exiting _queue_task() for managed_node1/gather_facts
18714 1726853442.98995: done queuing things up, now waiting for results queue to drain
18714 1726853442.98996: waiting for pending results...
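The dispatch cycle logged above ("entering _queue_task()" ... "waiting for pending results...") follows a standard work-queue pattern: the strategy puts the task on a queue, a worker executes it, and the main process drains a results queue. A minimal sketch of that pattern only — this is not Ansible's actual WorkerProcess code, a thread stands in for the forked worker, and the result dict just mimics the `"ping": "pong"` shape seen earlier in the log:

```python
# Illustrative queue/worker/result cycle (generic pattern, not Ansible internals).
import queue
import threading


def worker(task_q: queue.Queue, result_q: queue.Queue) -> None:
    # Worker loop: run tasks until the None sentinel arrives.
    for task in iter(task_q.get, None):
        # A real worker would run the task; here we echo a ping-style result.
        result_q.put({"task": task, "changed": False, "ping": "pong"})


def run_one_task(task_name: str) -> dict:
    task_q: queue.Queue = queue.Queue()
    result_q: queue.Queue = queue.Queue()
    t = threading.Thread(target=worker, args=(task_q, result_q))
    t.start()
    task_q.put(task_name)    # "entering _queue_task()"
    task_q.put(None)         # let the worker exit
    result = result_q.get()  # "waiting for pending results..."
    t.join()
    return result


print(run_one_task("Gathering Facts"))
```

The blocking `result_q.get()` is the analogue of the "no more pending results, returning what we have" bookkeeping in the log: the controller cannot advance the host state machine until every queued task has reported back.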
18714 1726853442.99303: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853442.99540: in run() - task 02083763-bbaf-e784-4f7d-0000000004e4 18714 1726853442.99546: variable 'ansible_search_path' from source: unknown 18714 1726853442.99549: calling self._execute() 18714 1726853442.99625: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853442.99636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853442.99659: variable 'omit' from source: magic vars 18714 1726853443.00069: variable 'ansible_distribution_major_version' from source: facts 18714 1726853443.00099: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853443.00194: variable 'omit' from source: magic vars 18714 1726853443.00197: variable 'omit' from source: magic vars 18714 1726853443.00199: variable 'omit' from source: magic vars 18714 1726853443.00230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853443.00273: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853443.00302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853443.00328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853443.00344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853443.00382: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853443.00391: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853443.00398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853443.00504: Set connection var ansible_shell_executable to /bin/sh 18714 1726853443.00520: Set 
connection var ansible_timeout to 10 18714 1726853443.00535: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853443.00546: Set connection var ansible_connection to ssh 18714 1726853443.00558: Set connection var ansible_shell_type to sh 18714 1726853443.00567: Set connection var ansible_pipelining to False 18714 1726853443.00595: variable 'ansible_shell_executable' from source: unknown 18714 1726853443.00629: variable 'ansible_connection' from source: unknown 18714 1726853443.00634: variable 'ansible_module_compression' from source: unknown 18714 1726853443.00640: variable 'ansible_shell_type' from source: unknown 18714 1726853443.00642: variable 'ansible_shell_executable' from source: unknown 18714 1726853443.00644: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853443.00646: variable 'ansible_pipelining' from source: unknown 18714 1726853443.00648: variable 'ansible_timeout' from source: unknown 18714 1726853443.00649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853443.00847: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853443.00850: variable 'omit' from source: magic vars 18714 1726853443.00860: starting attempt loop 18714 1726853443.00862: running the handler 18714 1726853443.00965: variable 'ansible_facts' from source: unknown 18714 1726853443.00968: _low_level_execute_command(): starting 18714 1726853443.00972: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853443.01734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853443.01785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853443.01857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853443.01882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853443.02158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853443.02175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853443.03891: stdout chunk (state=3): >>>/root <<< 18714 1726853443.03993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853443.04011: stdout chunk (state=3): >>><<< 18714 1726853443.04058: stderr chunk (state=3): >>><<< 18714 1726853443.04186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853443.04190: _low_level_execute_command(): starting 18714 1726853443.04193: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728 `" && echo ansible-tmp-1726853443.0409567-20532-209019139912728="` echo /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728 `" ) && sleep 0' 18714 1726853443.04786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853443.04790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853443.04839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853443.04869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853443.04933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853443.06862: stdout chunk (state=3): >>>ansible-tmp-1726853443.0409567-20532-209019139912728=/root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728 <<< 18714 1726853443.07113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853443.07116: stdout chunk (state=3): >>><<< 18714 1726853443.07119: stderr chunk (state=3): >>><<< 18714 1726853443.07122: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853443.0409567-20532-209019139912728=/root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853443.07163: variable 'ansible_module_compression' from source: unknown 18714 1726853443.07286: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853443.07396: variable 'ansible_facts' from source: unknown 18714 1726853443.07609: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/AnsiballZ_setup.py 18714 1726853443.07816: Sending initial data 18714 1726853443.07818: Sent initial data (154 bytes) 18714 1726853443.08465: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853443.08504: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853443.08521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853443.08550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853443.08619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853443.10206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853443.10254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
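The temp-directory command seen earlier in this exchange — `/bin/sh -c '( umask 77 && mkdir -p ... && mkdir ... && echo name=path ) && sleep 0'` — creates a private per-task directory and echoes its resolved `name=path` line back to the controller, which is the directory the `AnsiballZ_setup.py` payload is then sftp'd into. The same idiom can be re-run against a local path (the paths and stamp below are demo values, not the real `/root/.ansible/tmp/...` locations from this log):

```shell
#!/bin/sh
# Re-run Ansible's remote temp-dir idiom locally (demo paths, not the
# actual remote locations recorded in the log above).
base="${TMPDIR:-/tmp}/ansible-tmpdir-demo"
stamp="ansible-tmp-$(date +%s)-$$-demo"

# umask 77 makes the new directories mode 0700 (private to the caller);
# the trailing echo reports the resolved path back on stdout, and the
# "&& sleep 0" suffix appears on every low-level command in this log.
out=$( ( umask 77 && mkdir -p "$base" && mkdir "$base/$stamp" \
      && echo "$stamp=$base/$stamp" ) && sleep 0 )

echo "$out"
```

The controller parses the `name=path` line from stdout to learn where the module payload should be uploaded and executed on the remote host.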
<<< 18714 1726853443.10305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp2w9xjfv_ /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/AnsiballZ_setup.py <<< 18714 1726853443.10308: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/AnsiballZ_setup.py" <<< 18714 1726853443.10355: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp2w9xjfv_" to remote "/root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/AnsiballZ_setup.py" <<< 18714 1726853443.12144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853443.12242: stderr chunk (state=3): >>><<< 18714 1726853443.12249: stdout chunk (state=3): >>><<< 18714 1726853443.12254: done transferring module to remote 18714 1726853443.12257: _low_level_execute_command(): starting 18714 1726853443.12260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/ /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/AnsiballZ_setup.py && sleep 0' 18714 1726853443.13535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853443.13591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853443.13687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853443.13899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853443.13920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853443.13976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853443.15757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853443.15769: stdout chunk (state=3): >>><<< 18714 1726853443.15785: stderr chunk (state=3): >>><<< 18714 1726853443.15810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853443.15819: _low_level_execute_command(): starting 18714 1726853443.15945: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/AnsiballZ_setup.py && sleep 0' 18714 1726853443.16928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853443.16987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853443.17001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853443.17016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853443.17107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853443.17288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 18714 1726853443.17587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853443.81229: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", 
"ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "43", "epoch": "1726853443", "epoch_int": "1726853443", "date": "2024-09-20", "time": "13:30:43", "iso8601_micro": "2024-09-20T17:30:43.448464Z", "iso8601": "2024-09-20T17:30:43Z", "iso8601_basic": "20240920T133043448464", "iso8601_basic_short": "20240920T133043", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": 
{"1m": 0.2802734375, "5m": 0.330078125, "15m": 0.16748046875}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslconte<<< 18714 1726853443.81291: stdout chunk (state=3): >>>xt": true, "type": "cpython"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": 
[], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 609, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794951168, "block_size": 4096, "block_total": 65519099, "block_available": 63914783, "block_used": 1604316, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], 
"ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853443.83250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853443.83290: stderr chunk (state=3): >>><<< 18714 1726853443.83306: stdout chunk (state=3): >>><<< 18714 1726853443.83392: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 
21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "43", "epoch": "1726853443", "epoch_int": "1726853443", "date": "2024-09-20", "time": "13:30:43", "iso8601_micro": "2024-09-20T17:30:43.448464Z", "iso8601": "2024-09-20T17:30:43Z", "iso8601_basic": "20240920T133043448464", "iso8601_basic_short": "20240920T133043", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.2802734375, "5m": 0.330078125, "15m": 0.16748046875}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 609, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794951168, "block_size": 4096, "block_total": 65519099, "block_available": 63914783, "block_used": 1604316, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", 
"HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853443.84132: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853443.84198: _low_level_execute_command(): starting 18714 1726853443.84247: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853443.0409567-20532-209019139912728/ > /dev/null 2>&1 && sleep 0' 18714 1726853443.85308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853443.85329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853443.85347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853443.85439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853443.85476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853443.85493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853443.85510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853443.85581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853443.87777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853443.87780: stdout chunk (state=3): >>><<< 18714 1726853443.87783: stderr chunk (state=3): >>><<< 18714 1726853443.87785: _low_level_execute_command() done: rc=0, stdout=, stderr=[OpenSSH debug output identical to the stderr chunks above] 18714 1726853443.87788: handler run complete 18714 1726853443.87790: variable 'ansible_facts' from source: unknown 18714 1726853443.87983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853443.88316: variable 'ansible_facts' from source: unknown 18714 1726853443.88410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853443.88554: attempt loop complete, returning result 18714 1726853443.88595: _execute() done 18714 1726853443.88604: dumping result to json 18714 1726853443.88643: done dumping result, returning 18714 1726853443.88655: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-0000000004e4] 18714 1726853443.88663: sending task result for task 02083763-bbaf-e784-4f7d-0000000004e4 18714 1726853443.89339: done sending task result for task 02083763-bbaf-e784-4f7d-0000000004e4 18714 1726853443.89342: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853443.89813: no more pending results, returning what we have 18714 1726853443.89817: results queue empty 18714 1726853443.89818: checking for any_errors_fatal 18714 1726853443.89819: done checking for any_errors_fatal 18714 1726853443.89820: checking for max_fail_percentage 18714 1726853443.89822: done checking for max_fail_percentage 18714 1726853443.89822: checking to see if all hosts have failed and the running result is not ok 18714 1726853443.89823: done checking to see if all hosts have failed 18714 1726853443.89824: getting the remaining hosts for this loop 18714 1726853443.89825: done getting the remaining hosts for this loop 18714 1726853443.89829: getting the next task for host
managed_node1 18714 1726853443.89835: done getting next task for host managed_node1 18714 1726853443.89837: ^ task is: TASK: meta (flush_handlers) 18714 1726853443.89838: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853443.89842: getting variables 18714 1726853443.89843: in VariableManager get_vars() 18714 1726853443.89881: Calling all_inventory to load vars for managed_node1 18714 1726853443.89884: Calling groups_inventory to load vars for managed_node1 18714 1726853443.89887: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853443.89903: Calling all_plugins_play to load vars for managed_node1 18714 1726853443.89907: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853443.89911: Calling groups_plugins_play to load vars for managed_node1 18714 1726853443.91821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853443.95258: done with get_vars() 18714 1726853443.95287: done getting variables 18714 1726853443.95470: in VariableManager get_vars() 18714 1726853443.95482: Calling all_inventory to load vars for managed_node1 18714 1726853443.95484: Calling groups_inventory to load vars for managed_node1 18714 1726853443.95486: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853443.95491: Calling all_plugins_play to load vars for managed_node1 18714 1726853443.95493: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853443.95496: Calling groups_plugins_play to load vars for managed_node1 18714 1726853443.97903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.00169: done with 
get_vars() 18714 1726853444.00204: done queuing things up, now waiting for results queue to drain 18714 1726853444.00206: results queue empty 18714 1726853444.00207: checking for any_errors_fatal 18714 1726853444.00212: done checking for any_errors_fatal 18714 1726853444.00212: checking for max_fail_percentage 18714 1726853444.00218: done checking for max_fail_percentage 18714 1726853444.00219: checking to see if all hosts have failed and the running result is not ok 18714 1726853444.00220: done checking to see if all hosts have failed 18714 1726853444.00220: getting the remaining hosts for this loop 18714 1726853444.00221: done getting the remaining hosts for this loop 18714 1726853444.00224: getting the next task for host managed_node1 18714 1726853444.00228: done getting next task for host managed_node1 18714 1726853444.00232: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 18714 1726853444.00233: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853444.00236: getting variables 18714 1726853444.00237: in VariableManager get_vars() 18714 1726853444.00246: Calling all_inventory to load vars for managed_node1 18714 1726853444.00256: Calling groups_inventory to load vars for managed_node1 18714 1726853444.00259: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.00265: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.00267: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.00270: Calling groups_plugins_play to load vars for managed_node1 18714 1726853444.01586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.03301: done with get_vars() 18714 1726853444.03324: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 13:30:44 -0400 (0:00:01.052) 0:00:40.417 ****** 18714 1726853444.03407: entering _queue_task() for managed_node1/include_tasks 18714 1726853444.03791: worker is 1 (out of 1 available) 18714 1726853444.03803: exiting _queue_task() for managed_node1/include_tasks 18714 1726853444.03816: done queuing things up, now waiting for results queue to drain 18714 1726853444.03817: waiting for pending results... 
18714 1726853444.04107: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 18714 1726853444.04281: in run() - task 02083763-bbaf-e784-4f7d-000000000074 18714 1726853444.04285: variable 'ansible_search_path' from source: unknown 18714 1726853444.04301: calling self._execute() 18714 1726853444.04403: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.04418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.04476: variable 'omit' from source: magic vars 18714 1726853444.04837: variable 'ansible_distribution_major_version' from source: facts 18714 1726853444.04854: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853444.04864: _execute() done 18714 1726853444.04874: dumping result to json 18714 1726853444.04883: done dumping result, returning 18714 1726853444.04896: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [02083763-bbaf-e784-4f7d-000000000074] 18714 1726853444.04905: sending task result for task 02083763-bbaf-e784-4f7d-000000000074 18714 1726853444.05059: no more pending results, returning what we have 18714 1726853444.05064: in VariableManager get_vars() 18714 1726853444.05103: Calling all_inventory to load vars for managed_node1 18714 1726853444.05106: Calling groups_inventory to load vars for managed_node1 18714 1726853444.05110: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.05123: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.05126: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.05129: Calling groups_plugins_play to load vars for managed_node1 18714 1726853444.05889: done sending task result for task 02083763-bbaf-e784-4f7d-000000000074 18714 1726853444.05892: WORKER PROCESS EXITING 18714 1726853444.06768: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.08609: done with get_vars() 18714 1726853444.08636: variable 'ansible_search_path' from source: unknown 18714 1726853444.08659: we have included files to process 18714 1726853444.08660: generating all_blocks data 18714 1726853444.08661: done generating all_blocks data 18714 1726853444.08668: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18714 1726853444.08669: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18714 1726853444.08674: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18714 1726853444.08836: in VariableManager get_vars() 18714 1726853444.08860: done with get_vars() 18714 1726853444.08977: done processing included file 18714 1726853444.08979: iterating over new_blocks loaded from include file 18714 1726853444.08980: in VariableManager get_vars() 18714 1726853444.08992: done with get_vars() 18714 1726853444.08994: filtering new block on tags 18714 1726853444.09010: done filtering new block on tags 18714 1726853444.09012: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 18714 1726853444.09017: extending task lists for all hosts with included blocks 18714 1726853444.09066: done extending task lists 18714 1726853444.09068: done processing included files 18714 1726853444.09068: results queue empty 18714 1726853444.09069: checking for any_errors_fatal 18714 1726853444.09072: done checking for any_errors_fatal 18714 1726853444.09073: checking for max_fail_percentage 18714 1726853444.09074: done 
checking for max_fail_percentage 18714 1726853444.09075: checking to see if all hosts have failed and the running result is not ok 18714 1726853444.09075: done checking to see if all hosts have failed 18714 1726853444.09076: getting the remaining hosts for this loop 18714 1726853444.09077: done getting the remaining hosts for this loop 18714 1726853444.09080: getting the next task for host managed_node1 18714 1726853444.09083: done getting next task for host managed_node1 18714 1726853444.09085: ^ task is: TASK: Include the task 'get_profile_stat.yml' 18714 1726853444.09087: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853444.09089: getting variables 18714 1726853444.09090: in VariableManager get_vars() 18714 1726853444.09098: Calling all_inventory to load vars for managed_node1 18714 1726853444.09100: Calling groups_inventory to load vars for managed_node1 18714 1726853444.09102: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.09107: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.09109: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.09112: Calling groups_plugins_play to load vars for managed_node1 18714 1726853444.11792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.15298: done with get_vars() 18714 1726853444.15331: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:30:44 -0400 (0:00:00.121) 0:00:40.538 ****** 18714 1726853444.15522: entering _queue_task() for managed_node1/include_tasks 18714 1726853444.16250: worker is 1 (out of 1 available) 18714 1726853444.16265: exiting _queue_task() for managed_node1/include_tasks 18714 1726853444.16279: done queuing things up, now waiting for results queue to drain 18714 1726853444.16280: waiting for pending results... 
18714 1726853444.16683: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 18714 1726853444.17052: in run() - task 02083763-bbaf-e784-4f7d-0000000004f5 18714 1726853444.17057: variable 'ansible_search_path' from source: unknown 18714 1726853444.17060: variable 'ansible_search_path' from source: unknown 18714 1726853444.17062: calling self._execute() 18714 1726853444.17209: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.17212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.17216: variable 'omit' from source: magic vars 18714 1726853444.18336: variable 'ansible_distribution_major_version' from source: facts 18714 1726853444.18400: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853444.18404: _execute() done 18714 1726853444.18508: dumping result to json 18714 1726853444.18511: done dumping result, returning 18714 1726853444.18519: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-e784-4f7d-0000000004f5] 18714 1726853444.18523: sending task result for task 02083763-bbaf-e784-4f7d-0000000004f5 18714 1726853444.18588: done sending task result for task 02083763-bbaf-e784-4f7d-0000000004f5 18714 1726853444.18590: WORKER PROCESS EXITING 18714 1726853444.18620: no more pending results, returning what we have 18714 1726853444.18632: in VariableManager get_vars() 18714 1726853444.18668: Calling all_inventory to load vars for managed_node1 18714 1726853444.18673: Calling groups_inventory to load vars for managed_node1 18714 1726853444.18676: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.18690: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.18693: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.18696: Calling groups_plugins_play to load vars for managed_node1 18714 
1726853444.22853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.26393: done with get_vars() 18714 1726853444.26536: variable 'ansible_search_path' from source: unknown 18714 1726853444.26538: variable 'ansible_search_path' from source: unknown 18714 1726853444.26733: we have included files to process 18714 1726853444.26735: generating all_blocks data 18714 1726853444.26736: done generating all_blocks data 18714 1726853444.26737: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18714 1726853444.26738: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18714 1726853444.26741: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18714 1726853444.28394: done processing included file 18714 1726853444.28396: iterating over new_blocks loaded from include file 18714 1726853444.28398: in VariableManager get_vars() 18714 1726853444.28412: done with get_vars() 18714 1726853444.28414: filtering new block on tags 18714 1726853444.28450: done filtering new block on tags 18714 1726853444.28454: in VariableManager get_vars() 18714 1726853444.28475: done with get_vars() 18714 1726853444.28477: filtering new block on tags 18714 1726853444.28511: done filtering new block on tags 18714 1726853444.28513: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 18714 1726853444.28518: extending task lists for all hosts with included blocks 18714 1726853444.28647: done extending task lists 18714 1726853444.28649: done processing included files 18714 1726853444.28650: results queue empty 18714 
1726853444.28650: checking for any_errors_fatal 18714 1726853444.28654: done checking for any_errors_fatal 18714 1726853444.28654: checking for max_fail_percentage 18714 1726853444.28655: done checking for max_fail_percentage 18714 1726853444.28656: checking to see if all hosts have failed and the running result is not ok 18714 1726853444.28657: done checking to see if all hosts have failed 18714 1726853444.28657: getting the remaining hosts for this loop 18714 1726853444.28659: done getting the remaining hosts for this loop 18714 1726853444.28661: getting the next task for host managed_node1 18714 1726853444.28665: done getting next task for host managed_node1 18714 1726853444.28668: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 18714 1726853444.28672: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853444.28675: getting variables 18714 1726853444.28675: in VariableManager get_vars() 18714 1726853444.28761: Calling all_inventory to load vars for managed_node1 18714 1726853444.28764: Calling groups_inventory to load vars for managed_node1 18714 1726853444.28767: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.28774: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.28777: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.28780: Calling groups_plugins_play to load vars for managed_node1 18714 1726853444.30149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.33340: done with get_vars() 18714 1726853444.33368: done getting variables 18714 1726853444.33414: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:30:44 -0400 (0:00:00.179) 0:00:40.717 ****** 18714 1726853444.33450: entering _queue_task() for managed_node1/set_fact 18714 1726853444.33813: worker is 1 (out of 1 available) 18714 1726853444.33825: exiting _queue_task() for managed_node1/set_fact 18714 1726853444.33838: done queuing things up, now waiting for results queue to drain 18714 1726853444.33838: waiting for pending results... 
18714 1726853444.34295: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 18714 1726853444.34306: in run() - task 02083763-bbaf-e784-4f7d-000000000502 18714 1726853444.34310: variable 'ansible_search_path' from source: unknown 18714 1726853444.34313: variable 'ansible_search_path' from source: unknown 18714 1726853444.34316: calling self._execute() 18714 1726853444.34367: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.34373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.34386: variable 'omit' from source: magic vars 18714 1726853444.34779: variable 'ansible_distribution_major_version' from source: facts 18714 1726853444.34790: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853444.34795: variable 'omit' from source: magic vars 18714 1726853444.34842: variable 'omit' from source: magic vars 18714 1726853444.34947: variable 'omit' from source: magic vars 18714 1726853444.34951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853444.34966: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853444.34986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853444.35004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853444.35015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853444.35045: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853444.35048: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.35055: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 18714 1726853444.35154: Set connection var ansible_shell_executable to /bin/sh 18714 1726853444.35163: Set connection var ansible_timeout to 10 18714 1726853444.35167: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853444.35181: Set connection var ansible_connection to ssh 18714 1726853444.35186: Set connection var ansible_shell_type to sh 18714 1726853444.35191: Set connection var ansible_pipelining to False 18714 1726853444.35213: variable 'ansible_shell_executable' from source: unknown 18714 1726853444.35216: variable 'ansible_connection' from source: unknown 18714 1726853444.35219: variable 'ansible_module_compression' from source: unknown 18714 1726853444.35221: variable 'ansible_shell_type' from source: unknown 18714 1726853444.35223: variable 'ansible_shell_executable' from source: unknown 18714 1726853444.35226: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.35230: variable 'ansible_pipelining' from source: unknown 18714 1726853444.35232: variable 'ansible_timeout' from source: unknown 18714 1726853444.35237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.35407: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853444.35410: variable 'omit' from source: magic vars 18714 1726853444.35413: starting attempt loop 18714 1726853444.35415: running the handler 18714 1726853444.35417: handler run complete 18714 1726853444.35426: attempt loop complete, returning result 18714 1726853444.35429: _execute() done 18714 1726853444.35432: dumping result to json 18714 1726853444.35434: done dumping result, returning 18714 1726853444.35442: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-e784-4f7d-000000000502] 18714 1726853444.35447: sending task result for task 02083763-bbaf-e784-4f7d-000000000502 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 18714 1726853444.35591: no more pending results, returning what we have 18714 1726853444.35597: results queue empty 18714 1726853444.35598: checking for any_errors_fatal 18714 1726853444.35600: done checking for any_errors_fatal 18714 1726853444.35601: checking for max_fail_percentage 18714 1726853444.35602: done checking for max_fail_percentage 18714 1726853444.35603: checking to see if all hosts have failed and the running result is not ok 18714 1726853444.35604: done checking to see if all hosts have failed 18714 1726853444.35605: getting the remaining hosts for this loop 18714 1726853444.35606: done getting the remaining hosts for this loop 18714 1726853444.35610: getting the next task for host managed_node1 18714 1726853444.35624: done getting next task for host managed_node1 18714 1726853444.35630: ^ task is: TASK: Stat profile file 18714 1726853444.35636: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853444.35643: getting variables 18714 1726853444.35645: in VariableManager get_vars() 18714 1726853444.35679: Calling all_inventory to load vars for managed_node1 18714 1726853444.35682: Calling groups_inventory to load vars for managed_node1 18714 1726853444.35686: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.35696: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.35699: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.35702: Calling groups_plugins_play to load vars for managed_node1 18714 1726853444.36229: done sending task result for task 02083763-bbaf-e784-4f7d-000000000502 18714 1726853444.36233: WORKER PROCESS EXITING 18714 1726853444.37684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.39333: done with get_vars() 18714 1726853444.39361: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:30:44 -0400 (0:00:00.060) 0:00:40.778 ****** 18714 1726853444.39463: entering _queue_task() for managed_node1/stat 18714 1726853444.39928: worker is 1 (out of 1 available) 18714 1726853444.39941: exiting _queue_task() for managed_node1/stat 18714 1726853444.39951: done queuing things up, now waiting for results queue to drain 18714 1726853444.39955: waiting for pending results... 
18714 1726853444.40170: running TaskExecutor() for managed_node1/TASK: Stat profile file 18714 1726853444.40347: in run() - task 02083763-bbaf-e784-4f7d-000000000503 18714 1726853444.40352: variable 'ansible_search_path' from source: unknown 18714 1726853444.40355: variable 'ansible_search_path' from source: unknown 18714 1726853444.40358: calling self._execute() 18714 1726853444.40466: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.40475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.40486: variable 'omit' from source: magic vars 18714 1726853444.40912: variable 'ansible_distribution_major_version' from source: facts 18714 1726853444.40978: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853444.40981: variable 'omit' from source: magic vars 18714 1726853444.40984: variable 'omit' from source: magic vars 18714 1726853444.41085: variable 'profile' from source: include params 18714 1726853444.41089: variable 'interface' from source: set_fact 18714 1726853444.41165: variable 'interface' from source: set_fact 18714 1726853444.41192: variable 'omit' from source: magic vars 18714 1726853444.41347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853444.41351: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853444.41354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853444.41357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853444.41359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853444.41361: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 
1726853444.41364: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.41366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.41488: Set connection var ansible_shell_executable to /bin/sh 18714 1726853444.41494: Set connection var ansible_timeout to 10 18714 1726853444.41500: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853444.41508: Set connection var ansible_connection to ssh 18714 1726853444.41513: Set connection var ansible_shell_type to sh 18714 1726853444.41518: Set connection var ansible_pipelining to False 18714 1726853444.41541: variable 'ansible_shell_executable' from source: unknown 18714 1726853444.41544: variable 'ansible_connection' from source: unknown 18714 1726853444.41547: variable 'ansible_module_compression' from source: unknown 18714 1726853444.41549: variable 'ansible_shell_type' from source: unknown 18714 1726853444.41562: variable 'ansible_shell_executable' from source: unknown 18714 1726853444.41565: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.41567: variable 'ansible_pipelining' from source: unknown 18714 1726853444.41570: variable 'ansible_timeout' from source: unknown 18714 1726853444.41575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.41978: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853444.41982: variable 'omit' from source: magic vars 18714 1726853444.41986: starting attempt loop 18714 1726853444.41990: running the handler 18714 1726853444.41993: _low_level_execute_command(): starting 18714 1726853444.41996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853444.42654: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853444.42781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853444.42784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853444.42815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853444.44518: stdout chunk (state=3): >>>/root <<< 18714 1726853444.44628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853444.44641: stderr chunk (state=3): >>><<< 18714 1726853444.44649: stdout chunk (state=3): >>><<< 18714 1726853444.44683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853444.44777: _low_level_execute_command(): starting 18714 1726853444.44782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268 `" && echo ansible-tmp-1726853444.4469097-20605-56043446251268="` echo /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268 `" ) && sleep 0' 18714 1726853444.45341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853444.45356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853444.45384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853444.45442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853444.45504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853444.45522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853444.45567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853444.45612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853444.47533: stdout chunk (state=3): >>>ansible-tmp-1726853444.4469097-20605-56043446251268=/root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268 <<< 18714 1726853444.47661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853444.47664: stdout chunk (state=3): >>><<< 18714 1726853444.47677: stderr chunk (state=3): >>><<< 18714 1726853444.47713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853444.4469097-20605-56043446251268=/root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853444.47793: variable 'ansible_module_compression' from source: unknown 18714 1726853444.47882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18714 1726853444.47912: variable 'ansible_facts' from source: unknown 18714 1726853444.48009: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/AnsiballZ_stat.py 18714 1726853444.48290: Sending initial data 18714 1726853444.48293: Sent initial data (152 bytes) 18714 1726853444.49283: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853444.49316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853444.49367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853444.49385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853444.49454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853444.49504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853444.49538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853444.49720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853444.51157: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18714 1726853444.51218: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18714 1726853444.51222: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 18714 1726853444.51359: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853444.51398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853444.51482: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpaxgjot1n /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/AnsiballZ_stat.py <<< 18714 1726853444.51488: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/AnsiballZ_stat.py" <<< 18714 1726853444.51607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpaxgjot1n" to remote "/root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/AnsiballZ_stat.py" <<< 18714 1726853444.53543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853444.53594: stdout chunk (state=3): >>><<< 18714 1726853444.53694: stderr chunk (state=3): >>><<< 18714 1726853444.53697: done transferring module to remote 18714 1726853444.53699: _low_level_execute_command(): starting 18714 1726853444.53701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/ /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/AnsiballZ_stat.py && sleep 0' 18714 1726853444.54588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853444.54616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853444.54718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853444.54742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853444.54886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853444.56792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853444.56797: stdout chunk (state=3): >>><<< 18714 1726853444.56799: stderr chunk (state=3): >>><<< 18714 1726853444.56802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853444.56804: _low_level_execute_command(): starting 18714 1726853444.56806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/AnsiballZ_stat.py && sleep 0' 18714 1726853444.57886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853444.58089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853444.58094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853444.58096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853444.58332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853444.73550: stdout 
chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18714 1726853444.75096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853444.75099: stdout chunk (state=3): >>><<< 18714 1726853444.75102: stderr chunk (state=3): >>><<< 18714 1726853444.75104: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853444.75108: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853444.75110: _low_level_execute_command(): starting 18714 1726853444.75112: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853444.4469097-20605-56043446251268/ > /dev/null 2>&1 && sleep 0' 18714 1726853444.76258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853444.76288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853444.76298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853444.76313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853444.76326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853444.76333: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853444.76342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853444.76646: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853444.76689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853444.76892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853444.78740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853444.78782: stderr chunk (state=3): >>><<< 18714 1726853444.78986: stdout chunk (state=3): >>><<< 18714 1726853444.79003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853444.79010: handler run complete 18714 1726853444.79032: attempt loop complete, returning result 18714 1726853444.79035: _execute() done 18714 1726853444.79038: dumping result to json 18714 1726853444.79042: done dumping result, returning 18714 1726853444.79051: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-e784-4f7d-000000000503] 18714 1726853444.79058: sending task result for task 02083763-bbaf-e784-4f7d-000000000503 18714 1726853444.79170: done sending task result for task 02083763-bbaf-e784-4f7d-000000000503 18714 1726853444.79175: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18714 1726853444.79244: no more pending results, returning what we have 18714 1726853444.79248: results queue empty 18714 1726853444.79249: checking for any_errors_fatal 18714 1726853444.79260: done checking for any_errors_fatal 18714 1726853444.79261: checking for max_fail_percentage 18714 1726853444.79263: done checking for max_fail_percentage 18714 1726853444.79264: checking to see if all hosts have failed and the running result is not ok 18714 1726853444.79268: done checking to see if all hosts have failed 18714 1726853444.79269: getting the remaining hosts for this loop 18714 1726853444.79270: done getting the remaining hosts for this loop 18714 1726853444.79276: getting the next task for host managed_node1 18714 1726853444.79285: done getting next task for host managed_node1 18714 1726853444.79288: ^ task is: TASK: Set NM profile exist flag based on the profile files 18714 1726853444.79292: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853444.79296: getting variables 18714 1726853444.79297: in VariableManager get_vars() 18714 1726853444.79330: Calling all_inventory to load vars for managed_node1 18714 1726853444.79333: Calling groups_inventory to load vars for managed_node1 18714 1726853444.79336: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.79347: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.79349: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.79356: Calling groups_plugins_play to load vars for managed_node1 18714 1726853444.82630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853444.94189: done with get_vars() 18714 1726853444.94216: done getting variables 18714 1726853444.94267: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** 
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:30:44 -0400 (0:00:00.548) 0:00:41.326 ****** 18714 1726853444.94305: entering _queue_task() for managed_node1/set_fact 18714 1726853444.94752: worker is 1 (out of 1 available) 18714 1726853444.94765: exiting _queue_task() for managed_node1/set_fact 18714 1726853444.94778: done queuing things up, now waiting for results queue to drain 18714 1726853444.94780: waiting for pending results... 18714 1726853444.95218: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 18714 1726853444.95223: in run() - task 02083763-bbaf-e784-4f7d-000000000504 18714 1726853444.95226: variable 'ansible_search_path' from source: unknown 18714 1726853444.95228: variable 'ansible_search_path' from source: unknown 18714 1726853444.95243: calling self._execute() 18714 1726853444.95355: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853444.95367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853444.95385: variable 'omit' from source: magic vars 18714 1726853444.95831: variable 'ansible_distribution_major_version' from source: facts 18714 1726853444.95849: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853444.95986: variable 'profile_stat' from source: set_fact 18714 1726853444.96017: Evaluated conditional (profile_stat.stat.exists): False 18714 1726853444.96024: when evaluation is False, skipping this task 18714 1726853444.96032: _execute() done 18714 1726853444.96039: dumping result to json 18714 1726853444.96081: done dumping result, returning 18714 1726853444.96085: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-e784-4f7d-000000000504] 18714 1726853444.96087: sending task result for task 
02083763-bbaf-e784-4f7d-000000000504 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18714 1726853444.96231: no more pending results, returning what we have 18714 1726853444.96235: results queue empty 18714 1726853444.96236: checking for any_errors_fatal 18714 1726853444.96246: done checking for any_errors_fatal 18714 1726853444.96246: checking for max_fail_percentage 18714 1726853444.96248: done checking for max_fail_percentage 18714 1726853444.96249: checking to see if all hosts have failed and the running result is not ok 18714 1726853444.96250: done checking to see if all hosts have failed 18714 1726853444.96251: getting the remaining hosts for this loop 18714 1726853444.96252: done getting the remaining hosts for this loop 18714 1726853444.96256: getting the next task for host managed_node1 18714 1726853444.96264: done getting next task for host managed_node1 18714 1726853444.96266: ^ task is: TASK: Get NM profile info 18714 1726853444.96270: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853444.96277: getting variables 18714 1726853444.96279: in VariableManager get_vars() 18714 1726853444.96311: Calling all_inventory to load vars for managed_node1 18714 1726853444.96314: Calling groups_inventory to load vars for managed_node1 18714 1726853444.96318: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853444.96331: Calling all_plugins_play to load vars for managed_node1 18714 1726853444.96334: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853444.96338: Calling groups_plugins_play to load vars for managed_node1 18714 1726853444.97131: done sending task result for task 02083763-bbaf-e784-4f7d-000000000504 18714 1726853444.97134: WORKER PROCESS EXITING 18714 1726853444.99567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853445.02969: done with get_vars() 18714 1726853445.03003: done getting variables 18714 1726853445.03222: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:30:45 -0400 (0:00:00.089) 0:00:41.415 ****** 18714 1726853445.03256: entering _queue_task() for managed_node1/shell 18714 1726853445.03372: Creating lock for shell 18714 1726853445.04052: worker is 1 (out of 1 available) 18714 1726853445.04066: exiting _queue_task() for managed_node1/shell 18714 1726853445.04133: done queuing things up, now waiting for results queue to drain 18714 1726853445.04134: waiting for pending results... 
18714 1726853445.04570: running TaskExecutor() for managed_node1/TASK: Get NM profile info 18714 1726853445.04662: in run() - task 02083763-bbaf-e784-4f7d-000000000505 18714 1726853445.04897: variable 'ansible_search_path' from source: unknown 18714 1726853445.04906: variable 'ansible_search_path' from source: unknown 18714 1726853445.04926: calling self._execute() 18714 1726853445.05027: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853445.05034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853445.05044: variable 'omit' from source: magic vars 18714 1726853445.05844: variable 'ansible_distribution_major_version' from source: facts 18714 1726853445.05859: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853445.05866: variable 'omit' from source: magic vars 18714 1726853445.06117: variable 'omit' from source: magic vars 18714 1726853445.06219: variable 'profile' from source: include params 18714 1726853445.06223: variable 'interface' from source: set_fact 18714 1726853445.06500: variable 'interface' from source: set_fact 18714 1726853445.06523: variable 'omit' from source: magic vars 18714 1726853445.06561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853445.06598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853445.06620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853445.06638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853445.06651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853445.06887: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 
1726853445.06890: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853445.06893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853445.06995: Set connection var ansible_shell_executable to /bin/sh 18714 1726853445.07002: Set connection var ansible_timeout to 10 18714 1726853445.07007: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853445.07016: Set connection var ansible_connection to ssh 18714 1726853445.07021: Set connection var ansible_shell_type to sh 18714 1726853445.07027: Set connection var ansible_pipelining to False 18714 1726853445.07049: variable 'ansible_shell_executable' from source: unknown 18714 1726853445.07052: variable 'ansible_connection' from source: unknown 18714 1726853445.07069: variable 'ansible_module_compression' from source: unknown 18714 1726853445.07074: variable 'ansible_shell_type' from source: unknown 18714 1726853445.07076: variable 'ansible_shell_executable' from source: unknown 18714 1726853445.07078: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853445.07080: variable 'ansible_pipelining' from source: unknown 18714 1726853445.07083: variable 'ansible_timeout' from source: unknown 18714 1726853445.07386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853445.07415: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853445.07428: variable 'omit' from source: magic vars 18714 1726853445.07433: starting attempt loop 18714 1726853445.07436: running the handler 18714 1726853445.07446: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853445.07470: _low_level_execute_command(): starting 18714 1726853445.07712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853445.09279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853445.09282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853445.09284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853445.09287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853445.09289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853445.09290: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853445.09292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853445.09294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853445.09296: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853445.09298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853445.09299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853445.09301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853445.09303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853445.09304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853445.09306: stderr chunk (state=3): >>>debug2: match found <<< 18714 
1726853445.09308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853445.09436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853445.09485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853445.11153: stdout chunk (state=3): >>>/root <<< 18714 1726853445.11445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853445.11450: stdout chunk (state=3): >>><<< 18714 1726853445.11462: stderr chunk (state=3): >>><<< 18714 1726853445.11485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 
1726853445.11498: _low_level_execute_command(): starting 18714 1726853445.11505: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695 `" && echo ansible-tmp-1726853445.1148486-20629-107989419401695="` echo /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695 `" ) && sleep 0' 18714 1726853445.12738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853445.12742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853445.12744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853445.12747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853445.12749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853445.12753: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853445.12755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853445.12757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853445.12759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853445.12761: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853445.12764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853445.12768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853445.12850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853445.13107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853445.13143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853445.15067: stdout chunk (state=3): >>>ansible-tmp-1726853445.1148486-20629-107989419401695=/root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695 <<< 18714 1726853445.15163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853445.15302: stderr chunk (state=3): >>><<< 18714 1726853445.15305: stdout chunk (state=3): >>><<< 18714 1726853445.15346: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853445.1148486-20629-107989419401695=/root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853445.15364: variable 'ansible_module_compression' from source: unknown 18714 1726853445.15465: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853445.15473: variable 'ansible_facts' from source: unknown 18714 1726853445.15742: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/AnsiballZ_command.py 18714 1726853445.16103: Sending initial data 18714 1726853445.16108: Sent initial data (156 bytes) 18714 1726853445.17404: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853445.17407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853445.17410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853445.17412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853445.17415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853445.17586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 18714 1726853445.17591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853445.17714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853445.19399: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853445.19438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853445.19800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp5wqxdqjx /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/AnsiballZ_command.py <<< 18714 1726853445.19809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp5wqxdqjx" to remote "/root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/AnsiballZ_command.py" <<< 18714 1726853445.22472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853445.22626: stderr chunk (state=3): >>><<< 18714 1726853445.22638: stdout chunk (state=3): >>><<< 18714 1726853445.22641: done transferring module to remote 18714 1726853445.22644: _low_level_execute_command(): starting 18714 1726853445.22646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/ /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/AnsiballZ_command.py && sleep 0' 18714 1726853445.24765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853445.24790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853445.24813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853445.24882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853445.25003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853445.26793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853445.26803: stdout chunk (state=3): >>><<< 18714 1726853445.26814: stderr chunk (state=3): >>><<< 18714 1726853445.26866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853445.27021: _low_level_execute_command(): starting 18714 1726853445.27025: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/AnsiballZ_command.py && sleep 0' 18714 1726853445.28831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853445.29490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853445.29614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853445.46326: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 
13:30:45.446425", "end": "2024-09-20 13:30:45.462359", "delta": "0:00:00.015934", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853445.47931: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. <<< 18714 1726853445.47935: stdout chunk (state=3): >>><<< 18714 1726853445.47937: stderr chunk (state=3): >>><<< 18714 1726853445.47941: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 13:30:45.446425", "end": "2024-09-20 13:30:45.462359", "delta": "0:00:00.015934", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. 18714 1726853445.48520: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853445.48528: _low_level_execute_command(): starting 18714 1726853445.48531: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853445.1148486-20629-107989419401695/ > /dev/null 2>&1 && sleep 0' 18714 1726853445.49805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853445.49808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853445.49811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853445.49813: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853445.49815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853445.49979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853445.50342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853445.50417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853445.52286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853445.52290: stdout chunk (state=3): >>><<< 18714 1726853445.52298: stderr chunk (state=3): >>><<< 18714 1726853445.52313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853445.52319: handler run complete 18714 1726853445.52345: Evaluated conditional (False): False 18714 1726853445.52356: attempt loop complete, returning result 18714 1726853445.52359: _execute() done 18714 1726853445.52362: dumping result to json 18714 1726853445.52364: done dumping result, returning 18714 1726853445.52380: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-e784-4f7d-000000000505] 18714 1726853445.52496: sending task result for task 02083763-bbaf-e784-4f7d-000000000505 18714 1726853445.52809: done sending task result for task 02083763-bbaf-e784-4f7d-000000000505 18714 1726853445.52812: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.015934", "end": "2024-09-20 13:30:45.462359", "rc": 1, "start": "2024-09-20 13:30:45.446425" } MSG: non-zero return code ...ignoring 18714 1726853445.53018: no more pending results, returning what we have 18714 1726853445.53021: results queue empty 18714 1726853445.53022: checking for any_errors_fatal 18714 1726853445.53027: done checking for any_errors_fatal 18714 1726853445.53028: checking for max_fail_percentage 18714 1726853445.53031: done checking for max_fail_percentage 18714 1726853445.53032: checking to see if all hosts have failed and the running result is not ok 18714 1726853445.53033: done checking to see if all hosts have failed 18714 1726853445.53034: getting the remaining hosts for this loop 18714 1726853445.53035: done getting the remaining hosts for this loop 18714 1726853445.53038: getting the next task for host managed_node1 18714 1726853445.53045: done getting next task for host managed_node1 18714 1726853445.53047: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 18714 1726853445.53051: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18714 1726853445.53055: getting variables 18714 1726853445.53056: in VariableManager get_vars() 18714 1726853445.53083: Calling all_inventory to load vars for managed_node1 18714 1726853445.53086: Calling groups_inventory to load vars for managed_node1 18714 1726853445.53088: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853445.53098: Calling all_plugins_play to load vars for managed_node1 18714 1726853445.53100: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853445.53103: Calling groups_plugins_play to load vars for managed_node1 18714 1726853445.57100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853445.60965: done with get_vars() 18714 1726853445.60991: done getting variables 18714 1726853445.61087: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:30:45 -0400 (0:00:00.578) 0:00:41.994 ****** 18714 1726853445.61120: entering _queue_task() for managed_node1/set_fact 18714 1726853445.61683: worker is 1 (out of 1 available) 18714 1726853445.61694: exiting _queue_task() for managed_node1/set_fact 18714 1726853445.61711: done queuing things up, now waiting for results queue to drain 18714 1726853445.61713: waiting for pending results... 
18714 1726853445.61940: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 18714 1726853445.62036: in run() - task 02083763-bbaf-e784-4f7d-000000000506 18714 1726853445.62144: variable 'ansible_search_path' from source: unknown 18714 1726853445.62149: variable 'ansible_search_path' from source: unknown 18714 1726853445.62152: calling self._execute() 18714 1726853445.62206: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853445.62218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853445.62233: variable 'omit' from source: magic vars 18714 1726853445.63064: variable 'ansible_distribution_major_version' from source: facts 18714 1726853445.63112: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853445.63266: variable 'nm_profile_exists' from source: set_fact 18714 1726853445.63410: Evaluated conditional (nm_profile_exists.rc == 0): False 18714 1726853445.63418: when evaluation is False, skipping this task 18714 1726853445.63424: _execute() done 18714 1726853445.63432: dumping result to json 18714 1726853445.63608: done dumping result, returning 18714 1726853445.63612: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-e784-4f7d-000000000506] 18714 1726853445.63614: sending task result for task 02083763-bbaf-e784-4f7d-000000000506 18714 1726853445.63683: done sending task result for task 02083763-bbaf-e784-4f7d-000000000506 18714 1726853445.63686: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 18714 1726853445.63756: no more pending results, returning what we have 18714 1726853445.63761: results queue empty 18714 1726853445.63762: checking for any_errors_fatal 18714 
1726853445.63773: done checking for any_errors_fatal 18714 1726853445.63774: checking for max_fail_percentage 18714 1726853445.63776: done checking for max_fail_percentage 18714 1726853445.63777: checking to see if all hosts have failed and the running result is not ok 18714 1726853445.63778: done checking to see if all hosts have failed 18714 1726853445.63778: getting the remaining hosts for this loop 18714 1726853445.63780: done getting the remaining hosts for this loop 18714 1726853445.63783: getting the next task for host managed_node1 18714 1726853445.63795: done getting next task for host managed_node1 18714 1726853445.63798: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 18714 1726853445.63802: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853445.63806: getting variables 18714 1726853445.63808: in VariableManager get_vars() 18714 1726853445.63998: Calling all_inventory to load vars for managed_node1 18714 1726853445.64001: Calling groups_inventory to load vars for managed_node1 18714 1726853445.64005: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853445.64018: Calling all_plugins_play to load vars for managed_node1 18714 1726853445.64021: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853445.64024: Calling groups_plugins_play to load vars for managed_node1 18714 1726853445.66567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853445.71363: done with get_vars() 18714 1726853445.71589: done getting variables 18714 1726853445.71730: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853445.72134: variable 'profile' from source: include params 18714 1726853445.72210: variable 'interface' from source: set_fact 18714 1726853445.72498: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:30:45 -0400 (0:00:00.114) 0:00:42.109 ****** 18714 1726853445.72604: entering _queue_task() for managed_node1/command 18714 1726853445.73526: worker is 1 (out of 1 available) 18714 1726853445.73539: exiting _queue_task() for managed_node1/command 18714 1726853445.73630: done queuing things up, now waiting for results queue to drain 18714 1726853445.73631: waiting for pending results... 
18714 1726853445.74249: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 18714 1726853445.74539: in run() - task 02083763-bbaf-e784-4f7d-000000000508 18714 1726853445.74543: variable 'ansible_search_path' from source: unknown 18714 1726853445.74546: variable 'ansible_search_path' from source: unknown 18714 1726853445.74549: calling self._execute() 18714 1726853445.74781: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853445.74789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853445.74800: variable 'omit' from source: magic vars 18714 1726853445.75436: variable 'ansible_distribution_major_version' from source: facts 18714 1726853445.75448: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853445.75579: variable 'profile_stat' from source: set_fact 18714 1726853445.75628: Evaluated conditional (profile_stat.stat.exists): False 18714 1726853445.75632: when evaluation is False, skipping this task 18714 1726853445.75635: _execute() done 18714 1726853445.75637: dumping result to json 18714 1726853445.75647: done dumping result, returning 18714 1726853445.75651: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 [02083763-bbaf-e784-4f7d-000000000508] 18714 1726853445.75653: sending task result for task 02083763-bbaf-e784-4f7d-000000000508 18714 1726853445.75716: done sending task result for task 02083763-bbaf-e784-4f7d-000000000508 18714 1726853445.75719: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18714 1726853445.75780: no more pending results, returning what we have 18714 1726853445.75783: results queue empty 18714 1726853445.75784: checking for any_errors_fatal 18714 1726853445.75792: done checking for any_errors_fatal 18714 1726853445.75792: checking 
for max_fail_percentage 18714 1726853445.75794: done checking for max_fail_percentage 18714 1726853445.75795: checking to see if all hosts have failed and the running result is not ok 18714 1726853445.75795: done checking to see if all hosts have failed 18714 1726853445.75796: getting the remaining hosts for this loop 18714 1726853445.75797: done getting the remaining hosts for this loop 18714 1726853445.75801: getting the next task for host managed_node1 18714 1726853445.75809: done getting next task for host managed_node1 18714 1726853445.75811: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 18714 1726853445.75814: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853445.75819: getting variables 18714 1726853445.75820: in VariableManager get_vars() 18714 1726853445.76021: Calling all_inventory to load vars for managed_node1 18714 1726853445.76024: Calling groups_inventory to load vars for managed_node1 18714 1726853445.76027: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853445.76037: Calling all_plugins_play to load vars for managed_node1 18714 1726853445.76039: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853445.76042: Calling groups_plugins_play to load vars for managed_node1 18714 1726853445.77943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853445.80494: done with get_vars() 18714 1726853445.80523: done getting variables 18714 1726853445.80710: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853445.80877: variable 'profile' from source: include params 18714 1726853445.80881: variable 'interface' from source: set_fact 18714 1726853445.80942: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:30:45 -0400 (0:00:00.083) 0:00:42.195 ****** 18714 1726853445.81191: entering _queue_task() for managed_node1/set_fact 18714 1726853445.81733: worker is 1 (out of 1 available) 18714 1726853445.81745: exiting _queue_task() for managed_node1/set_fact 18714 1726853445.81761: done queuing things up, now waiting for results queue to drain 18714 1726853445.81762: waiting for pending results... 
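The four "Get/Verify the ... comment" tasks in this stretch all skip the same way: each is gated on `profile_stat.stat.exists` (after the distribution check `ansible_distribution_major_version != '6'` passes), and since the ifcfg file is absent, every one returns `"skip_reason": "Conditional result was False"` without executing. The log only shows the task names, the action plugins (`command` / `set_fact`), and the `when` conditions, so the task bodies below are an illustrative reconstruction of the `get_profile_stat.yml` pair, not the actual playbook source:

```yaml
# Hypothetical sketch -- grep pattern, register name, and fact name are
# assumptions; only the task names, modules, and `when` guards appear in the log.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_grep          # assumed register name
  when: profile_stat.stat.exists          # False here, so the task is skipped

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    ansible_managed_ok: "{{ ansible_managed_grep.rc == 0 }}"  # assumed fact
  when: profile_stat.stat.exists
```

Note that even for a skipped task, Ansible still templates the task name (the `profile`/`interface` variables resolved from `set_fact`, hence `ifcfg-lsr27` in the TASK banners) before emitting the skip result.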
18714 1726853445.82176: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 18714 1726853445.82182: in run() - task 02083763-bbaf-e784-4f7d-000000000509 18714 1726853445.82185: variable 'ansible_search_path' from source: unknown 18714 1726853445.82187: variable 'ansible_search_path' from source: unknown 18714 1726853445.82227: calling self._execute() 18714 1726853445.82524: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853445.82536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853445.82599: variable 'omit' from source: magic vars 18714 1726853445.83297: variable 'ansible_distribution_major_version' from source: facts 18714 1726853445.83363: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853445.83448: variable 'profile_stat' from source: set_fact 18714 1726853445.83478: Evaluated conditional (profile_stat.stat.exists): False 18714 1726853445.83486: when evaluation is False, skipping this task 18714 1726853445.83615: _execute() done 18714 1726853445.83691: dumping result to json 18714 1726853445.83694: done dumping result, returning 18714 1726853445.83697: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [02083763-bbaf-e784-4f7d-000000000509] 18714 1726853445.83699: sending task result for task 02083763-bbaf-e784-4f7d-000000000509 18714 1726853445.83768: done sending task result for task 02083763-bbaf-e784-4f7d-000000000509 18714 1726853445.83774: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18714 1726853445.83841: no more pending results, returning what we have 18714 1726853445.83845: results queue empty 18714 1726853445.83846: checking for any_errors_fatal 18714 1726853445.83854: done checking for any_errors_fatal 18714 1726853445.83855: 
checking for max_fail_percentage 18714 1726853445.83856: done checking for max_fail_percentage 18714 1726853445.83857: checking to see if all hosts have failed and the running result is not ok 18714 1726853445.83858: done checking to see if all hosts have failed 18714 1726853445.83859: getting the remaining hosts for this loop 18714 1726853445.83860: done getting the remaining hosts for this loop 18714 1726853445.83864: getting the next task for host managed_node1 18714 1726853445.83873: done getting next task for host managed_node1 18714 1726853445.83876: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 18714 1726853445.83879: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853445.83884: getting variables 18714 1726853445.83886: in VariableManager get_vars() 18714 1726853445.83920: Calling all_inventory to load vars for managed_node1 18714 1726853445.83923: Calling groups_inventory to load vars for managed_node1 18714 1726853445.83926: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853445.83939: Calling all_plugins_play to load vars for managed_node1 18714 1726853445.83941: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853445.83943: Calling groups_plugins_play to load vars for managed_node1 18714 1726853445.86393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853445.90458: done with get_vars() 18714 1726853445.90491: done getting variables 18714 1726853445.90666: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853445.90891: variable 'profile' from source: include params 18714 1726853445.90896: variable 'interface' from source: set_fact 18714 1726853445.91083: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:30:45 -0400 (0:00:00.099) 0:00:42.294 ****** 18714 1726853445.91120: entering _queue_task() for managed_node1/command 18714 1726853445.92000: worker is 1 (out of 1 available) 18714 1726853445.92011: exiting _queue_task() for managed_node1/command 18714 1726853445.92020: done queuing things up, now waiting for results queue to drain 18714 1726853445.92021: waiting for pending results... 
18714 1726853445.92826: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 18714 1726853445.92970: in run() - task 02083763-bbaf-e784-4f7d-00000000050a 18714 1726853445.92984: variable 'ansible_search_path' from source: unknown 18714 1726853445.93004: variable 'ansible_search_path' from source: unknown 18714 1726853445.93291: calling self._execute() 18714 1726853445.93626: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853445.93630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853445.93633: variable 'omit' from source: magic vars 18714 1726853445.94568: variable 'ansible_distribution_major_version' from source: facts 18714 1726853445.94619: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853445.95059: variable 'profile_stat' from source: set_fact 18714 1726853445.95082: Evaluated conditional (profile_stat.stat.exists): False 18714 1726853445.95097: when evaluation is False, skipping this task 18714 1726853445.95105: _execute() done 18714 1726853445.95112: dumping result to json 18714 1726853445.95120: done dumping result, returning 18714 1726853445.95131: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 [02083763-bbaf-e784-4f7d-00000000050a] 18714 1726853445.95141: sending task result for task 02083763-bbaf-e784-4f7d-00000000050a 18714 1726853445.95349: done sending task result for task 02083763-bbaf-e784-4f7d-00000000050a 18714 1726853445.95355: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18714 1726853445.95434: no more pending results, returning what we have 18714 1726853445.95437: results queue empty 18714 1726853445.95551: checking for any_errors_fatal 18714 1726853445.95560: done checking for any_errors_fatal 18714 1726853445.95561: checking for 
max_fail_percentage 18714 1726853445.95562: done checking for max_fail_percentage 18714 1726853445.95563: checking to see if all hosts have failed and the running result is not ok 18714 1726853445.95564: done checking to see if all hosts have failed 18714 1726853445.95564: getting the remaining hosts for this loop 18714 1726853445.95565: done getting the remaining hosts for this loop 18714 1726853445.95570: getting the next task for host managed_node1 18714 1726853445.95579: done getting next task for host managed_node1 18714 1726853445.95581: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 18714 1726853445.95585: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853445.95589: getting variables 18714 1726853445.95591: in VariableManager get_vars() 18714 1726853445.95621: Calling all_inventory to load vars for managed_node1 18714 1726853445.95623: Calling groups_inventory to load vars for managed_node1 18714 1726853445.95627: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853445.95639: Calling all_plugins_play to load vars for managed_node1 18714 1726853445.95641: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853445.95644: Calling groups_plugins_play to load vars for managed_node1 18714 1726853445.99085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.02846: done with get_vars() 18714 1726853446.02903: done getting variables 18714 1726853446.02963: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853446.03182: variable 'profile' from source: include params 18714 1726853446.03186: variable 'interface' from source: set_fact 18714 1726853446.03357: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:30:46 -0400 (0:00:00.123) 0:00:42.418 ****** 18714 1726853446.03549: entering _queue_task() for managed_node1/set_fact 18714 1726853446.04473: worker is 1 (out of 1 available) 18714 1726853446.04486: exiting _queue_task() for managed_node1/set_fact 18714 1726853446.04722: done queuing things up, now waiting for results queue to drain 18714 1726853446.04724: waiting for pending results... 
18714 1726853446.05703: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 18714 1726853446.05720: in run() - task 02083763-bbaf-e784-4f7d-00000000050b 18714 1726853446.05724: variable 'ansible_search_path' from source: unknown 18714 1726853446.05727: variable 'ansible_search_path' from source: unknown 18714 1726853446.05730: calling self._execute() 18714 1726853446.05889: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.06049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.06067: variable 'omit' from source: magic vars 18714 1726853446.07778: variable 'ansible_distribution_major_version' from source: facts 18714 1726853446.07784: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853446.08127: variable 'profile_stat' from source: set_fact 18714 1726853446.08321: Evaluated conditional (profile_stat.stat.exists): False 18714 1726853446.08325: when evaluation is False, skipping this task 18714 1726853446.08328: _execute() done 18714 1726853446.08330: dumping result to json 18714 1726853446.08332: done dumping result, returning 18714 1726853446.08335: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 [02083763-bbaf-e784-4f7d-00000000050b] 18714 1726853446.08336: sending task result for task 02083763-bbaf-e784-4f7d-00000000050b skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18714 1726853446.08491: no more pending results, returning what we have 18714 1726853446.08496: results queue empty 18714 1726853446.08497: checking for any_errors_fatal 18714 1726853446.08504: done checking for any_errors_fatal 18714 1726853446.08504: checking for max_fail_percentage 18714 1726853446.08506: done checking for max_fail_percentage 18714 1726853446.08507: checking to see if all hosts have 
failed and the running result is not ok 18714 1726853446.08508: done checking to see if all hosts have failed 18714 1726853446.08508: getting the remaining hosts for this loop 18714 1726853446.08509: done getting the remaining hosts for this loop 18714 1726853446.08513: getting the next task for host managed_node1 18714 1726853446.08522: done getting next task for host managed_node1 18714 1726853446.08524: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 18714 1726853446.08527: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853446.08532: getting variables 18714 1726853446.08534: in VariableManager get_vars() 18714 1726853446.08565: Calling all_inventory to load vars for managed_node1 18714 1726853446.08568: Calling groups_inventory to load vars for managed_node1 18714 1726853446.08748: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.08764: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.08767: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.08811: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.09586: done sending task result for task 02083763-bbaf-e784-4f7d-00000000050b 18714 1726853446.09590: WORKER PROCESS EXITING 18714 1726853446.10603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.12985: done with get_vars() 18714 1726853446.13008: done getting variables 18714 1726853446.13189: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853446.13338: variable 'profile' from source: include params 18714 1726853446.13341: variable 'interface' from source: set_fact 18714 1726853446.13514: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:30:46 -0400 (0:00:00.099) 0:00:42.518 ****** 18714 1726853446.13546: entering _queue_task() for managed_node1/assert 18714 1726853446.14364: worker is 1 (out of 1 available) 18714 1726853446.14377: exiting _queue_task() for managed_node1/assert 18714 
1726853446.14387: done queuing things up, now waiting for results queue to drain 18714 1726853446.14388: waiting for pending results... 18714 1726853446.14888: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' 18714 1726853446.14997: in run() - task 02083763-bbaf-e784-4f7d-0000000004f6 18714 1726853446.15021: variable 'ansible_search_path' from source: unknown 18714 1726853446.15025: variable 'ansible_search_path' from source: unknown 18714 1726853446.15064: calling self._execute() 18714 1726853446.15287: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.15291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.15294: variable 'omit' from source: magic vars 18714 1726853446.15948: variable 'ansible_distribution_major_version' from source: facts 18714 1726853446.15961: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853446.15967: variable 'omit' from source: magic vars 18714 1726853446.16225: variable 'omit' from source: magic vars 18714 1726853446.16394: variable 'profile' from source: include params 18714 1726853446.16398: variable 'interface' from source: set_fact 18714 1726853446.16468: variable 'interface' from source: set_fact 18714 1726853446.16488: variable 'omit' from source: magic vars 18714 1726853446.16647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853446.16684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853446.16706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853446.16722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853446.16735: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853446.16885: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853446.16889: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.16891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.17138: Set connection var ansible_shell_executable to /bin/sh 18714 1726853446.17141: Set connection var ansible_timeout to 10 18714 1726853446.17143: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853446.17150: Set connection var ansible_connection to ssh 18714 1726853446.17155: Set connection var ansible_shell_type to sh 18714 1726853446.17158: Set connection var ansible_pipelining to False 18714 1726853446.17160: variable 'ansible_shell_executable' from source: unknown 18714 1726853446.17162: variable 'ansible_connection' from source: unknown 18714 1726853446.17165: variable 'ansible_module_compression' from source: unknown 18714 1726853446.17167: variable 'ansible_shell_type' from source: unknown 18714 1726853446.17169: variable 'ansible_shell_executable' from source: unknown 18714 1726853446.17172: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.17175: variable 'ansible_pipelining' from source: unknown 18714 1726853446.17178: variable 'ansible_timeout' from source: unknown 18714 1726853446.17361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.17607: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853446.17625: variable 'omit' from source: magic vars 18714 1726853446.17631: starting 
attempt loop 18714 1726853446.17634: running the handler 18714 1726853446.17862: variable 'lsr_net_profile_exists' from source: set_fact 18714 1726853446.17865: Evaluated conditional (not lsr_net_profile_exists): True 18714 1726853446.17874: handler run complete 18714 1726853446.17889: attempt loop complete, returning result 18714 1726853446.17892: _execute() done 18714 1726853446.17894: dumping result to json 18714 1726853446.17978: done dumping result, returning 18714 1726853446.17982: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' [02083763-bbaf-e784-4f7d-0000000004f6] 18714 1726853446.17986: sending task result for task 02083763-bbaf-e784-4f7d-0000000004f6 18714 1726853446.18069: done sending task result for task 02083763-bbaf-e784-4f7d-0000000004f6 18714 1726853446.18073: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18714 1726853446.18149: no more pending results, returning what we have 18714 1726853446.18153: results queue empty 18714 1726853446.18154: checking for any_errors_fatal 18714 1726853446.18164: done checking for any_errors_fatal 18714 1726853446.18165: checking for max_fail_percentage 18714 1726853446.18166: done checking for max_fail_percentage 18714 1726853446.18168: checking to see if all hosts have failed and the running result is not ok 18714 1726853446.18168: done checking to see if all hosts have failed 18714 1726853446.18169: getting the remaining hosts for this loop 18714 1726853446.18172: done getting the remaining hosts for this loop 18714 1726853446.18175: getting the next task for host managed_node1 18714 1726853446.18185: done getting next task for host managed_node1 18714 1726853446.18188: ^ task is: TASK: Include the task 'assert_device_absent.yml' 18714 1726853446.18190: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853446.18195: getting variables 18714 1726853446.18197: in VariableManager get_vars() 18714 1726853446.18229: Calling all_inventory to load vars for managed_node1 18714 1726853446.18232: Calling groups_inventory to load vars for managed_node1 18714 1726853446.18236: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.18248: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.18252: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.18255: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.21720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.23050: done with get_vars() 18714 1726853446.23075: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 13:30:46 -0400 (0:00:00.095) 0:00:42.614 ****** 18714 1726853446.23146: entering _queue_task() for managed_node1/include_tasks 18714 1726853446.23409: worker is 1 (out of 1 available) 18714 1726853446.23423: exiting _queue_task() for managed_node1/include_tasks 18714 1726853446.23435: done queuing things up, now waiting for results queue to drain 18714 1726853446.23437: waiting for pending results... 
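Unlike the skipped tasks, the assertion above actually runs, and its condition is visible almost verbatim in the log: `Evaluated conditional (not lsr_net_profile_exists): True`, where `lsr_net_profile_exists` was set earlier by `get_profile_stat.yml`. The task in `assert_profile_absent.yml:5` is therefore most likely of this shape (the `fail_msg` is an assumption; the log only confirms the condition and "All assertions passed"):

```yaml
- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists
    # fail_msg is illustrative; the log shows only the passing condition
    fail_msg: "Profile {{ profile }} still exists"
```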
18714 1726853446.23617: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 18714 1726853446.23716: in run() - task 02083763-bbaf-e784-4f7d-000000000075 18714 1726853446.23721: variable 'ansible_search_path' from source: unknown 18714 1726853446.23761: calling self._execute() 18714 1726853446.23833: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.23838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.23849: variable 'omit' from source: magic vars 18714 1726853446.24295: variable 'ansible_distribution_major_version' from source: facts 18714 1726853446.24302: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853446.24309: _execute() done 18714 1726853446.24312: dumping result to json 18714 1726853446.24314: done dumping result, returning 18714 1726853446.24323: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [02083763-bbaf-e784-4f7d-000000000075] 18714 1726853446.24332: sending task result for task 02083763-bbaf-e784-4f7d-000000000075 18714 1726853446.24576: no more pending results, returning what we have 18714 1726853446.24582: in VariableManager get_vars() 18714 1726853446.24620: Calling all_inventory to load vars for managed_node1 18714 1726853446.24628: Calling groups_inventory to load vars for managed_node1 18714 1726853446.24634: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.24654: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.24658: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.24662: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.25284: done sending task result for task 02083763-bbaf-e784-4f7d-000000000075 18714 1726853446.25287: WORKER PROCESS EXITING 18714 1726853446.27804: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.29337: done with get_vars() 18714 1726853446.29360: variable 'ansible_search_path' from source: unknown 18714 1726853446.29379: we have included files to process 18714 1726853446.29380: generating all_blocks data 18714 1726853446.29383: done generating all_blocks data 18714 1726853446.29389: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18714 1726853446.29391: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18714 1726853446.29394: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18714 1726853446.29605: in VariableManager get_vars() 18714 1726853446.29621: done with get_vars() 18714 1726853446.29955: done processing included file 18714 1726853446.29958: iterating over new_blocks loaded from include file 18714 1726853446.29959: in VariableManager get_vars() 18714 1726853446.29970: done with get_vars() 18714 1726853446.29975: filtering new block on tags 18714 1726853446.29998: done filtering new block on tags 18714 1726853446.30001: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 18714 1726853446.30006: extending task lists for all hosts with included blocks 18714 1726853446.30557: done extending task lists 18714 1726853446.30558: done processing included files 18714 1726853446.30559: results queue empty 18714 1726853446.30560: checking for any_errors_fatal 18714 1726853446.30563: done checking for any_errors_fatal 18714 1726853446.30564: checking for max_fail_percentage 18714 1726853446.30565: done 
checking for max_fail_percentage 18714 1726853446.30566: checking to see if all hosts have failed and the running result is not ok 18714 1726853446.30567: done checking to see if all hosts have failed 18714 1726853446.30567: getting the remaining hosts for this loop 18714 1726853446.30568: done getting the remaining hosts for this loop 18714 1726853446.30676: getting the next task for host managed_node1 18714 1726853446.30681: done getting next task for host managed_node1 18714 1726853446.30684: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18714 1726853446.30687: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853446.30689: getting variables 18714 1726853446.30690: in VariableManager get_vars() 18714 1726853446.30698: Calling all_inventory to load vars for managed_node1 18714 1726853446.30700: Calling groups_inventory to load vars for managed_node1 18714 1726853446.30702: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.30707: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.30709: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.30712: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.33422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.36560: done with get_vars() 18714 1726853446.36715: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:30:46 -0400 (0:00:00.136) 0:00:42.751 ****** 18714 1726853446.36810: entering _queue_task() for managed_node1/include_tasks 18714 1726853446.37186: worker is 1 (out of 1 available) 18714 1726853446.37196: exiting _queue_task() for managed_node1/include_tasks 18714 1726853446.37205: done queuing things up, now waiting for results queue to drain 18714 1726853446.37206: waiting for pending results... 
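The include chain traced above (the play including `assert_device_absent.yml`, which at line 3 includes `get_interface_stat.yml`) corresponds to tasks roughly like the following. This is a sketch reconstructed only from the task names and file paths in this trace; the actual file contents may differ:

```yaml
# Sketch (assumed) of the include chain seen in the trace.
# In the test playbook:
- name: Include the task 'assert_device_absent.yml'
  include_tasks: tasks/assert_device_absent.yml

# In tasks/assert_device_absent.yml (line 3, per the task path above):
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```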
18714 1726853446.37476: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 18714 1726853446.37599: in run() - task 02083763-bbaf-e784-4f7d-00000000053c 18714 1726853446.37618: variable 'ansible_search_path' from source: unknown 18714 1726853446.37626: variable 'ansible_search_path' from source: unknown 18714 1726853446.37667: calling self._execute() 18714 1726853446.37788: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.37791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.37798: variable 'omit' from source: magic vars 18714 1726853446.38187: variable 'ansible_distribution_major_version' from source: facts 18714 1726853446.38222: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853446.38226: _execute() done 18714 1726853446.38229: dumping result to json 18714 1726853446.38231: done dumping result, returning 18714 1726853446.38276: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-e784-4f7d-00000000053c] 18714 1726853446.38279: sending task result for task 02083763-bbaf-e784-4f7d-00000000053c 18714 1726853446.38601: done sending task result for task 02083763-bbaf-e784-4f7d-00000000053c 18714 1726853446.38604: WORKER PROCESS EXITING 18714 1726853446.38632: no more pending results, returning what we have 18714 1726853446.38637: in VariableManager get_vars() 18714 1726853446.38673: Calling all_inventory to load vars for managed_node1 18714 1726853446.38676: Calling groups_inventory to load vars for managed_node1 18714 1726853446.38679: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.38690: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.38692: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.38695: Calling groups_plugins_play to load vars for managed_node1 18714 
1726853446.40223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.41894: done with get_vars() 18714 1726853446.41915: variable 'ansible_search_path' from source: unknown 18714 1726853446.41917: variable 'ansible_search_path' from source: unknown 18714 1726853446.41962: we have included files to process 18714 1726853446.41964: generating all_blocks data 18714 1726853446.41965: done generating all_blocks data 18714 1726853446.41966: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18714 1726853446.41967: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18714 1726853446.41970: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18714 1726853446.42167: done processing included file 18714 1726853446.42169: iterating over new_blocks loaded from include file 18714 1726853446.42172: in VariableManager get_vars() 18714 1726853446.42186: done with get_vars() 18714 1726853446.42187: filtering new block on tags 18714 1726853446.42202: done filtering new block on tags 18714 1726853446.42204: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 18714 1726853446.42210: extending task lists for all hosts with included blocks 18714 1726853446.42316: done extending task lists 18714 1726853446.42317: done processing included files 18714 1726853446.42318: results queue empty 18714 1726853446.42319: checking for any_errors_fatal 18714 1726853446.42321: done checking for any_errors_fatal 18714 1726853446.42322: checking for max_fail_percentage 18714 1726853446.42323: done checking for 
max_fail_percentage 18714 1726853446.42324: checking to see if all hosts have failed and the running result is not ok 18714 1726853446.42324: done checking to see if all hosts have failed 18714 1726853446.42325: getting the remaining hosts for this loop 18714 1726853446.42326: done getting the remaining hosts for this loop 18714 1726853446.42328: getting the next task for host managed_node1 18714 1726853446.42332: done getting next task for host managed_node1 18714 1726853446.42334: ^ task is: TASK: Get stat for interface {{ interface }} 18714 1726853446.42337: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853446.42339: getting variables 18714 1726853446.42339: in VariableManager get_vars() 18714 1726853446.42347: Calling all_inventory to load vars for managed_node1 18714 1726853446.42349: Calling groups_inventory to load vars for managed_node1 18714 1726853446.42354: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.42363: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.42366: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.42368: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.43585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.45207: done with get_vars() 18714 1726853446.45227: done getting variables 18714 1726853446.45386: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:30:46 -0400 (0:00:00.086) 0:00:42.837 ****** 18714 1726853446.45411: entering _queue_task() for managed_node1/stat 18714 1726853446.45664: worker is 1 (out of 1 available) 18714 1726853446.45684: exiting _queue_task() for managed_node1/stat 18714 1726853446.45695: done queuing things up, now waiting for results queue to drain 18714 1726853446.45696: waiting for pending results... 
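The stat invocation executed next (its `module_args` are dumped further down in the trace: `path: /sys/class/net/lsr27` with `get_attributes`, `get_checksum`, and `get_mime` all false) corresponds to a task sketch like this; the `register` name is an assumption, not taken from the trace:

```yaml
# Sketch of the 'Get stat for interface' task, reconstructed from the
# module_args shown later in this trace (the register name is assumed).
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```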
18714 1726853446.45888: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 18714 1726853446.45963: in run() - task 02083763-bbaf-e784-4f7d-000000000554 18714 1726853446.45974: variable 'ansible_search_path' from source: unknown 18714 1726853446.45978: variable 'ansible_search_path' from source: unknown 18714 1726853446.46006: calling self._execute() 18714 1726853446.46081: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.46084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.46094: variable 'omit' from source: magic vars 18714 1726853446.46375: variable 'ansible_distribution_major_version' from source: facts 18714 1726853446.46382: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853446.46388: variable 'omit' from source: magic vars 18714 1726853446.46423: variable 'omit' from source: magic vars 18714 1726853446.46492: variable 'interface' from source: set_fact 18714 1726853446.46505: variable 'omit' from source: magic vars 18714 1726853446.46536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853446.46565: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853446.46587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853446.46600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853446.46611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853446.46636: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853446.46639: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.46642: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.46715: Set connection var ansible_shell_executable to /bin/sh 18714 1726853446.46721: Set connection var ansible_timeout to 10 18714 1726853446.46726: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853446.46732: Set connection var ansible_connection to ssh 18714 1726853446.46736: Set connection var ansible_shell_type to sh 18714 1726853446.46741: Set connection var ansible_pipelining to False 18714 1726853446.46759: variable 'ansible_shell_executable' from source: unknown 18714 1726853446.46762: variable 'ansible_connection' from source: unknown 18714 1726853446.46765: variable 'ansible_module_compression' from source: unknown 18714 1726853446.46767: variable 'ansible_shell_type' from source: unknown 18714 1726853446.46769: variable 'ansible_shell_executable' from source: unknown 18714 1726853446.46773: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.46776: variable 'ansible_pipelining' from source: unknown 18714 1726853446.46778: variable 'ansible_timeout' from source: unknown 18714 1726853446.46784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.46926: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18714 1726853446.46936: variable 'omit' from source: magic vars 18714 1726853446.46941: starting attempt loop 18714 1726853446.46944: running the handler 18714 1726853446.46958: _low_level_execute_command(): starting 18714 1726853446.46966: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853446.47462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853446.47496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853446.47500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853446.47502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853446.47550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853446.47558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853446.47561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853446.47610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853446.49287: stdout chunk (state=3): >>>/root <<< 18714 1726853446.49385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853446.49407: stderr chunk (state=3): >>><<< 18714 1726853446.49410: stdout chunk (state=3): >>><<< 18714 1726853446.49428: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853446.49441: _low_level_execute_command(): starting 18714 1726853446.49446: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821 `" && echo ansible-tmp-1726853446.4942765-20694-116710233485821="` echo /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821 `" ) && sleep 0' 18714 1726853446.49840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853446.49875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853446.49878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853446.49888: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853446.49890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853446.49892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853446.49934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853446.49937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853446.49985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853446.51887: stdout chunk (state=3): >>>ansible-tmp-1726853446.4942765-20694-116710233485821=/root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821 <<< 18714 1726853446.52176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853446.52179: stdout chunk (state=3): >>><<< 18714 1726853446.52182: stderr chunk (state=3): >>><<< 18714 1726853446.52185: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853446.4942765-20694-116710233485821=/root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853446.52187: variable 'ansible_module_compression' from source: unknown 18714 1726853446.52190: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18714 1726853446.52193: variable 'ansible_facts' from source: unknown 18714 1726853446.52276: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/AnsiballZ_stat.py 18714 1726853446.52390: Sending initial data 18714 1726853446.52394: Sent initial data (153 bytes) 18714 1726853446.53047: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853446.53111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853446.53136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853446.53206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853446.54740: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18714 1726853446.54750: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853446.54783: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853446.54823: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp47mr4gft /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/AnsiballZ_stat.py <<< 18714 1726853446.54831: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/AnsiballZ_stat.py" <<< 18714 1726853446.54864: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp47mr4gft" to remote "/root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/AnsiballZ_stat.py" <<< 18714 1726853446.55391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853446.55414: stderr chunk (state=3): >>><<< 18714 1726853446.55417: stdout chunk (state=3): >>><<< 18714 1726853446.55434: done transferring module to remote 18714 1726853446.55441: _low_level_execute_command(): starting 18714 1726853446.55447: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/ /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/AnsiballZ_stat.py && sleep 0' 18714 1726853446.55860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853446.55863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853446.55865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 18714 1726853446.55867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853446.55875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853446.55942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853446.55946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853446.56026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853446.57727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853446.57760: stderr chunk (state=3): >>><<< 18714 1726853446.57764: stdout chunk (state=3): >>><<< 18714 1726853446.57777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853446.57780: _low_level_execute_command(): starting 18714 1726853446.57786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/AnsiballZ_stat.py && sleep 0' 18714 1726853446.58340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853446.58344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853446.58346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853446.58348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853446.58350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853446.58355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853446.58448: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853446.58475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853446.73679: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18714 1726853446.75066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853446.75070: stderr chunk (state=3): >>><<< 18714 1726853446.75074: stdout chunk (state=3): >>><<< 18714 1726853446.75166: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853446.75197: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853446.75214: _low_level_execute_command(): starting 18714 1726853446.75217: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853446.4942765-20694-116710233485821/ > /dev/null 2>&1 && sleep 0' 18714 1726853446.76899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853446.77249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853446.77298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853446.79155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853446.79209: stdout chunk (state=3): >>><<< 18714 1726853446.79212: stderr chunk (state=3): >>><<< 18714 1726853446.79215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853446.79217: handler run complete 18714 1726853446.79220: attempt loop complete, returning result 18714 1726853446.79221: _execute() done 18714 1726853446.79223: dumping result to json 18714 1726853446.79225: done dumping result, returning 18714 1726853446.79227: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [02083763-bbaf-e784-4f7d-000000000554] 18714 1726853446.79232: sending task result for task 02083763-bbaf-e784-4f7d-000000000554 18714 1726853446.79576: done sending task result for task 02083763-bbaf-e784-4f7d-000000000554 18714 1726853446.79579: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18714 1726853446.79636: no more pending results, returning what we have 18714 1726853446.79641: results queue empty 18714 1726853446.79642: checking for any_errors_fatal 18714 1726853446.79643: done checking for any_errors_fatal 18714 1726853446.79644: checking for max_fail_percentage 18714 1726853446.79646: done checking for max_fail_percentage 18714 1726853446.79647: checking to see if all hosts have failed and the running result is not ok 18714 1726853446.79648: done checking to see if all hosts have failed 18714 1726853446.79648: getting the remaining hosts for this loop 18714 1726853446.79650: done getting the remaining hosts for this loop 18714 1726853446.79653: getting the next task for host managed_node1 18714 1726853446.79660: done getting next task for host managed_node1 18714 1726853446.79662: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 18714 1726853446.79665: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853446.79669: getting variables 18714 1726853446.79673: in VariableManager get_vars() 18714 1726853446.79705: Calling all_inventory to load vars for managed_node1 18714 1726853446.79708: Calling groups_inventory to load vars for managed_node1 18714 1726853446.79712: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.79724: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.79727: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.79730: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.82841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.85968: done with get_vars() 18714 1726853446.85998: done getting variables 18714 1726853446.86079: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18714 1726853446.86265: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:30:46 -0400 (0:00:00.408) 0:00:43.246 ****** 18714 1726853446.86304: entering _queue_task() for managed_node1/assert 18714 1726853446.86886: worker is 1 (out 
of 1 available) 18714 1726853446.86897: exiting _queue_task() for managed_node1/assert 18714 1726853446.86907: done queuing things up, now waiting for results queue to drain 18714 1726853446.86908: waiting for pending results... 18714 1726853446.87008: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' 18714 1726853446.87237: in run() - task 02083763-bbaf-e784-4f7d-00000000053d 18714 1726853446.87245: variable 'ansible_search_path' from source: unknown 18714 1726853446.87249: variable 'ansible_search_path' from source: unknown 18714 1726853446.87251: calling self._execute() 18714 1726853446.87903: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.87906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.87909: variable 'omit' from source: magic vars 18714 1726853446.88912: variable 'ansible_distribution_major_version' from source: facts 18714 1726853446.88993: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853446.89011: variable 'omit' from source: magic vars 18714 1726853446.89058: variable 'omit' from source: magic vars 18714 1726853446.89446: variable 'interface' from source: set_fact 18714 1726853446.89450: variable 'omit' from source: magic vars 18714 1726853446.89452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853446.89454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853446.89620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853446.89687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853446.89705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 18714 1726853446.89806: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853446.89816: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.89824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.90046: Set connection var ansible_shell_executable to /bin/sh 18714 1726853446.90060: Set connection var ansible_timeout to 10 18714 1726853446.90073: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853446.90105: Set connection var ansible_connection to ssh 18714 1726853446.90117: Set connection var ansible_shell_type to sh 18714 1726853446.90312: Set connection var ansible_pipelining to False 18714 1726853446.90316: variable 'ansible_shell_executable' from source: unknown 18714 1726853446.90318: variable 'ansible_connection' from source: unknown 18714 1726853446.90321: variable 'ansible_module_compression' from source: unknown 18714 1726853446.90323: variable 'ansible_shell_type' from source: unknown 18714 1726853446.90421: variable 'ansible_shell_executable' from source: unknown 18714 1726853446.90424: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853446.90427: variable 'ansible_pipelining' from source: unknown 18714 1726853446.90429: variable 'ansible_timeout' from source: unknown 18714 1726853446.90431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853446.90496: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853446.90588: variable 'omit' from source: magic vars 18714 1726853446.90600: starting attempt loop 18714 1726853446.90644: running the handler 18714 1726853446.91076: variable 
'interface_stat' from source: set_fact 18714 1726853446.91079: Evaluated conditional (not interface_stat.stat.exists): True 18714 1726853446.91086: handler run complete 18714 1726853446.91088: attempt loop complete, returning result 18714 1726853446.91090: _execute() done 18714 1726853446.91092: dumping result to json 18714 1726853446.91094: done dumping result, returning 18714 1726853446.91096: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' [02083763-bbaf-e784-4f7d-00000000053d] 18714 1726853446.91098: sending task result for task 02083763-bbaf-e784-4f7d-00000000053d 18714 1726853446.91166: done sending task result for task 02083763-bbaf-e784-4f7d-00000000053d 18714 1726853446.91169: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18714 1726853446.91238: no more pending results, returning what we have 18714 1726853446.91243: results queue empty 18714 1726853446.91245: checking for any_errors_fatal 18714 1726853446.91255: done checking for any_errors_fatal 18714 1726853446.91256: checking for max_fail_percentage 18714 1726853446.91259: done checking for max_fail_percentage 18714 1726853446.91260: checking to see if all hosts have failed and the running result is not ok 18714 1726853446.91260: done checking to see if all hosts have failed 18714 1726853446.91261: getting the remaining hosts for this loop 18714 1726853446.91263: done getting the remaining hosts for this loop 18714 1726853446.91267: getting the next task for host managed_node1 18714 1726853446.91282: done getting next task for host managed_node1 18714 1726853446.91285: ^ task is: TASK: meta (flush_handlers) 18714 1726853446.91287: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18714 1726853446.91292: getting variables 18714 1726853446.91294: in VariableManager get_vars() 18714 1726853446.91325: Calling all_inventory to load vars for managed_node1 18714 1726853446.91329: Calling groups_inventory to load vars for managed_node1 18714 1726853446.91332: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.91344: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.91347: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.91350: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.93257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853446.96667: done with get_vars() 18714 1726853446.96794: done getting variables 18714 1726853446.96929: in VariableManager get_vars() 18714 1726853446.96939: Calling all_inventory to load vars for managed_node1 18714 1726853446.96941: Calling groups_inventory to load vars for managed_node1 18714 1726853446.96944: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853446.97011: Calling all_plugins_play to load vars for managed_node1 18714 1726853446.97014: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853446.97018: Calling groups_plugins_play to load vars for managed_node1 18714 1726853446.99573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853447.02702: done with get_vars() 18714 1726853447.02747: done queuing things up, now waiting for results queue to drain 18714 1726853447.02749: results queue empty 18714 1726853447.02750: checking for any_errors_fatal 18714 1726853447.02753: done checking for any_errors_fatal 18714 1726853447.02754: checking for max_fail_percentage 18714 1726853447.02755: done checking for max_fail_percentage 18714 1726853447.02756: checking to see if all hosts have failed and 
the running result is not ok 18714 1726853447.02757: done checking to see if all hosts have failed 18714 1726853447.02763: getting the remaining hosts for this loop 18714 1726853447.02764: done getting the remaining hosts for this loop 18714 1726853447.02767: getting the next task for host managed_node1 18714 1726853447.02772: done getting next task for host managed_node1 18714 1726853447.02774: ^ task is: TASK: meta (flush_handlers) 18714 1726853447.02776: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853447.02779: getting variables 18714 1726853447.02780: in VariableManager get_vars() 18714 1726853447.02789: Calling all_inventory to load vars for managed_node1 18714 1726853447.02792: Calling groups_inventory to load vars for managed_node1 18714 1726853447.02794: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853447.02800: Calling all_plugins_play to load vars for managed_node1 18714 1726853447.02802: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853447.02805: Calling groups_plugins_play to load vars for managed_node1 18714 1726853447.04110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853447.06975: done with get_vars() 18714 1726853447.07003: done getting variables 18714 1726853447.07064: in VariableManager get_vars() 18714 1726853447.07077: Calling all_inventory to load vars for managed_node1 18714 1726853447.07080: Calling groups_inventory to load vars for managed_node1 18714 1726853447.07083: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853447.07088: Calling all_plugins_play to load vars for managed_node1 18714 1726853447.07090: Calling 
groups_plugins_inventory to load vars for managed_node1 18714 1726853447.07095: Calling groups_plugins_play to load vars for managed_node1 18714 1726853447.09339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853447.13850: done with get_vars() 18714 1726853447.14362: done queuing things up, now waiting for results queue to drain 18714 1726853447.14364: results queue empty 18714 1726853447.14365: checking for any_errors_fatal 18714 1726853447.14367: done checking for any_errors_fatal 18714 1726853447.14367: checking for max_fail_percentage 18714 1726853447.14368: done checking for max_fail_percentage 18714 1726853447.14369: checking to see if all hosts have failed and the running result is not ok 18714 1726853447.14370: done checking to see if all hosts have failed 18714 1726853447.14372: getting the remaining hosts for this loop 18714 1726853447.14374: done getting the remaining hosts for this loop 18714 1726853447.14377: getting the next task for host managed_node1 18714 1726853447.14381: done getting next task for host managed_node1 18714 1726853447.14382: ^ task is: None 18714 1726853447.14383: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853447.14384: done queuing things up, now waiting for results queue to drain 18714 1726853447.14385: results queue empty 18714 1726853447.14386: checking for any_errors_fatal 18714 1726853447.14386: done checking for any_errors_fatal 18714 1726853447.14387: checking for max_fail_percentage 18714 1726853447.14388: done checking for max_fail_percentage 18714 1726853447.14389: checking to see if all hosts have failed and the running result is not ok 18714 1726853447.14389: done checking to see if all hosts have failed 18714 1726853447.14390: getting the next task for host managed_node1 18714 1726853447.14393: done getting next task for host managed_node1 18714 1726853447.14393: ^ task is: None 18714 1726853447.14395: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853447.14602: in VariableManager get_vars() 18714 1726853447.14620: done with get_vars() 18714 1726853447.14627: in VariableManager get_vars() 18714 1726853447.14637: done with get_vars() 18714 1726853447.14642: variable 'omit' from source: magic vars 18714 1726853447.14677: in VariableManager get_vars() 18714 1726853447.14884: done with get_vars() 18714 1726853447.14909: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 18714 1726853447.15090: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18714 1726853447.15485: getting the remaining hosts for this loop 18714 1726853447.15487: done getting the remaining hosts for this loop 18714 1726853447.15489: getting the next task for host managed_node1 18714 1726853447.15492: done getting next task for host managed_node1 18714 1726853447.15494: ^ task is: TASK: Gathering Facts 18714 1726853447.15496: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853447.15498: getting variables 18714 1726853447.15498: in VariableManager get_vars() 18714 1726853447.15507: Calling all_inventory to load vars for managed_node1 18714 1726853447.15509: Calling groups_inventory to load vars for managed_node1 18714 1726853447.15512: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853447.15517: Calling all_plugins_play to load vars for managed_node1 18714 1726853447.15520: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853447.15523: Calling groups_plugins_play to load vars for managed_node1 18714 1726853447.18568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853447.22060: done with get_vars() 18714 1726853447.22693: done getting variables 18714 1726853447.22743: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 13:30:47 -0400 (0:00:00.364) 0:00:43.611 ****** 18714 1726853447.22977: entering _queue_task() for managed_node1/gather_facts 18714 1726853447.23518: worker is 1 (out of 1 available) 18714 1726853447.23531: exiting _queue_task() for managed_node1/gather_facts 18714 1726853447.23541: done queuing things up, now waiting for results queue to drain 18714 1726853447.23542: waiting for pending results... 
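[editor's note] The trace above shows the two-task pattern the test playbook uses to verify an interface is gone: a `stat` module call against `/sys/class/net/lsr27` (which returned `"exists": false`), followed by an `assert` task whose condition is `not interface_stat.stat.exists`. A minimal standalone sketch of the equivalent check, assuming a Linux sysfs layout where every network interface has an entry under `/sys/class/net` (the function name and the example interface name are illustrative, not from the playbook):

```python
import os


def interface_absent(name: str) -> bool:
    """Mirror the stat + assert pattern from the trace.

    The playbook's 'stat' task checks /sys/class/net/<name>; an
    interface that does not exist has no sysfs entry there, so the
    'assert' task's condition is simply `not ...stat.exists`.
    """
    return not os.path.exists(os.path.join("/sys/class/net", name))


# Equivalent of the assert task's condition for a name that should
# not exist (hypothetical interface name used for illustration):
assert interface_absent("lsr27-example-absent")
```

This reproduces only the condition being asserted; the surrounding bookkeeping in the log (connection plugin loading, ssh multiplexing, result queuing) is Ansible's executor machinery, not part of the check itself.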
18714 1726853447.24189: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18714 1726853447.24851: in run() - task 02083763-bbaf-e784-4f7d-00000000056d 18714 1726853447.24858: variable 'ansible_search_path' from source: unknown 18714 1726853447.24862: calling self._execute() 18714 1726853447.25069: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853447.25085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853447.25102: variable 'omit' from source: magic vars 18714 1726853447.26147: variable 'ansible_distribution_major_version' from source: facts 18714 1726853447.26781: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853447.26785: variable 'omit' from source: magic vars 18714 1726853447.26788: variable 'omit' from source: magic vars 18714 1726853447.26792: variable 'omit' from source: magic vars 18714 1726853447.26794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853447.26797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853447.26799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853447.27096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853447.27276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853447.27280: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853447.27283: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853447.27285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853447.27764: Set connection var ansible_shell_executable to /bin/sh 18714 1726853447.27768: Set 
connection var ansible_timeout to 10 18714 1726853447.27773: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853447.27776: Set connection var ansible_connection to ssh 18714 1726853447.27778: Set connection var ansible_shell_type to sh 18714 1726853447.27780: Set connection var ansible_pipelining to False 18714 1726853447.27980: variable 'ansible_shell_executable' from source: unknown 18714 1726853447.27984: variable 'ansible_connection' from source: unknown 18714 1726853447.27986: variable 'ansible_module_compression' from source: unknown 18714 1726853447.27988: variable 'ansible_shell_type' from source: unknown 18714 1726853447.27991: variable 'ansible_shell_executable' from source: unknown 18714 1726853447.27993: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853447.27995: variable 'ansible_pipelining' from source: unknown 18714 1726853447.27997: variable 'ansible_timeout' from source: unknown 18714 1726853447.27999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853447.28535: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853447.28558: variable 'omit' from source: magic vars 18714 1726853447.28642: starting attempt loop 18714 1726853447.28650: running the handler 18714 1726853447.28878: variable 'ansible_facts' from source: unknown 18714 1726853447.28886: _low_level_execute_command(): starting 18714 1726853447.28892: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853447.30196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853447.30416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853447.30420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853447.30465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853447.30634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853447.32319: stdout chunk (state=3): >>>/root <<< 18714 1726853447.32374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853447.32414: stderr chunk (state=3): >>><<< 18714 1726853447.32425: stdout chunk (state=3): >>><<< 18714 1726853447.32580: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853447.32583: _low_level_execute_command(): starting 18714 1726853447.32586: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796 `" && echo ansible-tmp-1726853447.3249562-20731-45279805041796="` echo /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796 `" ) && sleep 0' 18714 1726853447.33863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853447.33879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853447.33963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853447.33982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853447.34032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853447.36186: stdout chunk (state=3): >>>ansible-tmp-1726853447.3249562-20731-45279805041796=/root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796 <<< 18714 1726853447.36190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853447.36192: stdout chunk (state=3): >>><<< 18714 1726853447.36195: stderr chunk (state=3): >>><<< 18714 1726853447.36587: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853447.3249562-20731-45279805041796=/root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853447.36591: variable 'ansible_module_compression' from source: unknown 18714 1726853447.36593: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18714 1726853447.36595: variable 'ansible_facts' from source: unknown 18714 1726853447.37290: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/AnsiballZ_setup.py 18714 1726853447.37693: Sending initial data 18714 1726853447.37701: Sent initial data (153 bytes) 18714 1726853447.38697: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853447.38738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853447.38912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853447.38994: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853447.39044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853447.39081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853447.40651: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853447.40893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18714 1726853447.41297: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4x80xlyw /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/AnsiballZ_setup.py <<< 18714 1726853447.41308: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/AnsiballZ_setup.py" <<< 18714 1726853447.41533: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4x80xlyw" to remote "/root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/AnsiballZ_setup.py" <<< 18714 1726853447.45307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853447.45577: stderr chunk (state=3): >>><<< 18714 1726853447.45581: stdout chunk (state=3): >>><<< 18714 1726853447.45583: done transferring module to remote 18714 1726853447.45588: _low_level_execute_command(): starting 18714 1726853447.45599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/ /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/AnsiballZ_setup.py && sleep 0' 18714 1726853447.47001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853447.47129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853447.47148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853447.47162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853447.47488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853447.49351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853447.49391: stdout chunk (state=3): >>><<< 18714 1726853447.49686: stderr chunk (state=3): >>><<< 18714 1726853447.49691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853447.49698: _low_level_execute_command(): starting 18714 1726853447.49701: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/AnsiballZ_setup.py && sleep 0' 18714 1726853447.50807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853447.50887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853447.51009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853447.51052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853447.51085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853447.51164: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 18714 1726853448.14321: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.2578125, "5m": 0.32421875, "15m": 0.16650390625}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_fips": false, "ansible_date_time": {"year": 
"2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "47", "epoch": "1726853447", "epoch_int": "1726853447", "date": "2024-09-20", "time": "13:30:47", "iso8601_micro": "2024-09-20T17:30:47.784134Z", "iso8601": "2024-09-20T17:30:47Z", "iso8601_basic": "20240920T133047784134", "iso8601_basic_short": "20240920T133047", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": 
"ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "s<<< 18714 1726853448.14328: stdout chunk (state=3): >>>sh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, 
"used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 614, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794979840, "block_size": 4096, "block_total": 
65519099, "block_available": 63914790, "block_used": 1604309, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off 
[fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off 
[fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18714 1726853448.16366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18714 1726853448.16376: stderr chunk (state=3): >>><<< 18714 1726853448.16379: stdout chunk (state=3): >>><<< 18714 1726853448.16411: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.2578125, "5m": 0.32421875, "15m": 0.16650390625}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "30", "second": "47", "epoch": "1726853447", "epoch_int": "1726853447", "date": "2024-09-20", "time": "13:30:47", "iso8601_micro": "2024-09-20T17:30:47.784134Z", "iso8601": "2024-09-20T17:30:47Z", "iso8601_basic": "20240920T133047784134", "iso8601_basic_short": "20240920T133047", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 614, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794979840, "block_size": 4096, "block_total": 65519099, "block_available": 63914790, "block_used": 1604309, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", 
"prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": 
"off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853448.16647: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853448.16666: _low_level_execute_command(): starting 18714 1726853448.16670: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853447.3249562-20731-45279805041796/ > /dev/null 2>&1 && sleep 0' 18714 1726853448.17214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.17258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853448.17307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853448.17355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.17480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.19345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.19399: stderr chunk (state=3): >>><<< 18714 1726853448.19589: stdout chunk (state=3): >>><<< 18714 1726853448.19593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853448.19596: handler run complete 18714 1726853448.19641: variable 'ansible_facts' from source: unknown 18714 1726853448.19758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.20487: variable 'ansible_facts' from source: unknown 18714 1726853448.20577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.20894: attempt loop complete, returning result 18714 1726853448.20902: _execute() done 18714 1726853448.20908: dumping result to json 18714 1726853448.20942: done dumping result, returning 18714 1726853448.20953: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-e784-4f7d-00000000056d] 18714 1726853448.20993: sending task result for task 02083763-bbaf-e784-4f7d-00000000056d 18714 1726853448.21520: done sending task result for task 02083763-bbaf-e784-4f7d-00000000056d 18714 1726853448.21523: WORKER PROCESS EXITING ok: [managed_node1] 18714 1726853448.21830: no more pending results, returning what we have 18714 1726853448.21833: results queue empty 18714 1726853448.21834: checking for any_errors_fatal 18714 1726853448.21836: done checking for any_errors_fatal 18714 1726853448.21837: checking for max_fail_percentage 18714 1726853448.21838: done checking for max_fail_percentage 18714 1726853448.21839: checking to see if all hosts have failed and the running result is not ok 18714 1726853448.21840: done checking to see if all hosts have failed 18714 1726853448.21841: getting the remaining hosts for this loop 18714 1726853448.21842: done getting the remaining hosts for this loop 18714 1726853448.21846: getting the next task for host managed_node1 18714 
1726853448.21868: done getting next task for host managed_node1 18714 1726853448.21870: ^ task is: TASK: meta (flush_handlers) 18714 1726853448.21896: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853448.21917: getting variables 18714 1726853448.21919: in VariableManager get_vars() 18714 1726853448.21988: Calling all_inventory to load vars for managed_node1 18714 1726853448.22012: Calling groups_inventory to load vars for managed_node1 18714 1726853448.22017: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853448.22034: Calling all_plugins_play to load vars for managed_node1 18714 1726853448.22038: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853448.22041: Calling groups_plugins_play to load vars for managed_node1 18714 1726853448.30664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.32321: done with get_vars() 18714 1726853448.32354: done getting variables 18714 1726853448.32421: in VariableManager get_vars() 18714 1726853448.32431: Calling all_inventory to load vars for managed_node1 18714 1726853448.32433: Calling groups_inventory to load vars for managed_node1 18714 1726853448.32436: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853448.32441: Calling all_plugins_play to load vars for managed_node1 18714 1726853448.32443: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853448.32446: Calling groups_plugins_play to load vars for managed_node1 18714 1726853448.33637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.35478: done with get_vars() 18714 
1726853448.35504: done queuing things up, now waiting for results queue to drain 18714 1726853448.35506: results queue empty 18714 1726853448.35507: checking for any_errors_fatal 18714 1726853448.35511: done checking for any_errors_fatal 18714 1726853448.35512: checking for max_fail_percentage 18714 1726853448.35513: done checking for max_fail_percentage 18714 1726853448.35514: checking to see if all hosts have failed and the running result is not ok 18714 1726853448.35515: done checking to see if all hosts have failed 18714 1726853448.35521: getting the remaining hosts for this loop 18714 1726853448.35522: done getting the remaining hosts for this loop 18714 1726853448.35525: getting the next task for host managed_node1 18714 1726853448.35529: done getting next task for host managed_node1 18714 1726853448.35531: ^ task is: TASK: Verify network state restored to default 18714 1726853448.35533: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853448.35535: getting variables 18714 1726853448.35536: in VariableManager get_vars() 18714 1726853448.35550: Calling all_inventory to load vars for managed_node1 18714 1726853448.35556: Calling groups_inventory to load vars for managed_node1 18714 1726853448.35558: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853448.35564: Calling all_plugins_play to load vars for managed_node1 18714 1726853448.35567: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853448.35569: Calling groups_plugins_play to load vars for managed_node1 18714 1726853448.36765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.38450: done with get_vars() 18714 1726853448.38483: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 13:30:48 -0400 (0:00:01.157) 0:00:44.769 ****** 18714 1726853448.38572: entering _queue_task() for managed_node1/include_tasks 18714 1726853448.38966: worker is 1 (out of 1 available) 18714 1726853448.39180: exiting _queue_task() for managed_node1/include_tasks 18714 1726853448.39190: done queuing things up, now waiting for results queue to drain 18714 1726853448.39191: waiting for pending results... 
18714 1726853448.39321: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 18714 1726853448.39420: in run() - task 02083763-bbaf-e784-4f7d-000000000078 18714 1726853448.39446: variable 'ansible_search_path' from source: unknown 18714 1726853448.39491: calling self._execute() 18714 1726853448.39605: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853448.39617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853448.39643: variable 'omit' from source: magic vars 18714 1726853448.40088: variable 'ansible_distribution_major_version' from source: facts 18714 1726853448.40106: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853448.40116: _execute() done 18714 1726853448.40124: dumping result to json 18714 1726853448.40132: done dumping result, returning 18714 1726853448.40143: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [02083763-bbaf-e784-4f7d-000000000078] 18714 1726853448.40155: sending task result for task 02083763-bbaf-e784-4f7d-000000000078 18714 1726853448.40403: no more pending results, returning what we have 18714 1726853448.40409: in VariableManager get_vars() 18714 1726853448.40446: Calling all_inventory to load vars for managed_node1 18714 1726853448.40449: Calling groups_inventory to load vars for managed_node1 18714 1726853448.40456: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853448.40473: Calling all_plugins_play to load vars for managed_node1 18714 1726853448.40476: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853448.40480: Calling groups_plugins_play to load vars for managed_node1 18714 1726853448.41114: done sending task result for task 02083763-bbaf-e784-4f7d-000000000078 18714 1726853448.41125: WORKER PROCESS EXITING 18714 1726853448.42299: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.44050: done with get_vars() 18714 1726853448.44084: variable 'ansible_search_path' from source: unknown 18714 1726853448.44101: we have included files to process 18714 1726853448.44102: generating all_blocks data 18714 1726853448.44103: done generating all_blocks data 18714 1726853448.44104: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18714 1726853448.44105: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18714 1726853448.44107: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18714 1726853448.44563: done processing included file 18714 1726853448.44565: iterating over new_blocks loaded from include file 18714 1726853448.44567: in VariableManager get_vars() 18714 1726853448.44587: done with get_vars() 18714 1726853448.44589: filtering new block on tags 18714 1726853448.44607: done filtering new block on tags 18714 1726853448.44609: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 18714 1726853448.44615: extending task lists for all hosts with included blocks 18714 1726853448.44647: done extending task lists 18714 1726853448.44648: done processing included files 18714 1726853448.44649: results queue empty 18714 1726853448.44650: checking for any_errors_fatal 18714 1726853448.44654: done checking for any_errors_fatal 18714 1726853448.44654: checking for max_fail_percentage 18714 1726853448.44656: done checking for max_fail_percentage 18714 1726853448.44656: checking to see if all hosts have failed and the running 
result is not ok 18714 1726853448.44657: done checking to see if all hosts have failed 18714 1726853448.44658: getting the remaining hosts for this loop 18714 1726853448.44659: done getting the remaining hosts for this loop 18714 1726853448.44661: getting the next task for host managed_node1 18714 1726853448.44664: done getting next task for host managed_node1 18714 1726853448.44666: ^ task is: TASK: Check routes and DNS 18714 1726853448.44668: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853448.44669: getting variables 18714 1726853448.44670: in VariableManager get_vars() 18714 1726853448.44680: Calling all_inventory to load vars for managed_node1 18714 1726853448.44682: Calling groups_inventory to load vars for managed_node1 18714 1726853448.44684: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853448.44695: Calling all_plugins_play to load vars for managed_node1 18714 1726853448.44697: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853448.44700: Calling groups_plugins_play to load vars for managed_node1 18714 1726853448.47114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.50275: done with get_vars() 18714 1726853448.50300: done getting variables 18714 1726853448.50343: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:30:48 -0400 (0:00:00.117) 0:00:44.887 ****** 18714 1726853448.50376: entering _queue_task() for managed_node1/shell 18714 1726853448.50731: worker is 1 (out of 1 available) 18714 1726853448.50743: exiting _queue_task() for managed_node1/shell 18714 1726853448.50754: done queuing things up, now waiting for results queue to drain 18714 1726853448.50755: waiting for pending results... 
18714 1726853448.51188: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 18714 1726853448.51498: in run() - task 02083763-bbaf-e784-4f7d-00000000057e 18714 1726853448.51520: variable 'ansible_search_path' from source: unknown 18714 1726853448.51557: variable 'ansible_search_path' from source: unknown 18714 1726853448.51591: calling self._execute() 18714 1726853448.51694: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853448.51706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853448.51745: variable 'omit' from source: magic vars 18714 1726853448.52175: variable 'ansible_distribution_major_version' from source: facts 18714 1726853448.52192: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853448.52207: variable 'omit' from source: magic vars 18714 1726853448.52286: variable 'omit' from source: magic vars 18714 1726853448.52291: variable 'omit' from source: magic vars 18714 1726853448.52336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853448.52378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853448.52407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853448.52433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853448.52449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853448.52503: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853448.52507: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853448.52510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853448.52704: 
Set connection var ansible_shell_executable to /bin/sh 18714 1726853448.52708: Set connection var ansible_timeout to 10 18714 1726853448.52828: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853448.52832: Set connection var ansible_connection to ssh 18714 1726853448.52834: Set connection var ansible_shell_type to sh 18714 1726853448.52836: Set connection var ansible_pipelining to False 18714 1726853448.52838: variable 'ansible_shell_executable' from source: unknown 18714 1726853448.52840: variable 'ansible_connection' from source: unknown 18714 1726853448.52843: variable 'ansible_module_compression' from source: unknown 18714 1726853448.52846: variable 'ansible_shell_type' from source: unknown 18714 1726853448.52849: variable 'ansible_shell_executable' from source: unknown 18714 1726853448.52853: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853448.52856: variable 'ansible_pipelining' from source: unknown 18714 1726853448.52859: variable 'ansible_timeout' from source: unknown 18714 1726853448.52862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853448.53292: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853448.53301: variable 'omit' from source: magic vars 18714 1726853448.53305: starting attempt loop 18714 1726853448.53311: running the handler 18714 1726853448.53317: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853448.53321: 
_low_level_execute_command(): starting 18714 1726853448.53416: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853448.54499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.54548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853448.54577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.54647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.56343: stdout chunk (state=3): >>>/root <<< 18714 1726853448.56504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.56507: stdout chunk (state=3): >>><<< 18714 1726853448.56509: stderr chunk (state=3): >>><<< 18714 1726853448.56636: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853448.56639: _low_level_execute_command(): starting 18714 1726853448.56643: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866 `" && echo ansible-tmp-1726853448.5653589-20777-235745962993866="` echo /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866 `" ) && sleep 0' 18714 1726853448.57248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853448.57495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.57543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.59434: stdout chunk (state=3): >>>ansible-tmp-1726853448.5653589-20777-235745962993866=/root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866 <<< 18714 1726853448.59583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.59587: stdout chunk (state=3): >>><<< 18714 1726853448.59591: stderr chunk (state=3): >>><<< 18714 1726853448.59781: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853448.5653589-20777-235745962993866=/root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853448.59784: variable 'ansible_module_compression' from source: unknown 18714 1726853448.59786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853448.59788: variable 'ansible_facts' from source: unknown 18714 1726853448.59819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/AnsiballZ_command.py 18714 1726853448.59996: Sending initial data 18714 1726853448.59999: Sent initial data (156 bytes) 18714 1726853448.60563: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853448.60574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853448.60585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853448.60600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853448.60690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.60701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853448.60713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853448.60732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.60800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.62627: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 1726853448.62782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4sw8jyie" to remote "/root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/AnsiballZ_command.py" <<< 18714 1726853448.62787: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmp4sw8jyie /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/AnsiballZ_command.py <<< 18714 1726853448.63891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.63919: stdout chunk (state=3): >>><<< 18714 1726853448.63985: stderr chunk (state=3): >>><<< 18714 1726853448.63988: done transferring module to remote 18714 1726853448.63991: _low_level_execute_command(): starting 18714 1726853448.63996: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/ /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/AnsiballZ_command.py && sleep 0' 18714 1726853448.64856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853448.64869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853448.64888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853448.64916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853448.64968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.65041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853448.65104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853448.65107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.65418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.67014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.67024: stdout chunk (state=3): >>><<< 18714 1726853448.67026: stderr chunk (state=3): >>><<< 18714 1726853448.67030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853448.67032: _low_level_execute_command(): starting 18714 1726853448.67035: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/AnsiballZ_command.py && sleep 0' 18714 1726853448.67416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853448.67420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853448.67422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 18714 1726853448.67424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853448.67426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.67477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK <<< 18714 1726853448.67484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.67526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.83777: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3000sec preferred_lft 3000sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:30:48.827398", "end": "2024-09-20 13:30:48.836011", "delta": "0:00:00.008613", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", 
"_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853448.85199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.85257: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 18714 1726853448.85268: stdout chunk (state=3): >>><<< 18714 1726853448.85285: stderr chunk (state=3): >>><<< 18714 1726853448.85446: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3000sec preferred_lft 3000sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:30:48.827398", "end": "2024-09-20 13:30:48.836011", "delta": "0:00:00.008613", 
"msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
18714 1726853448.85459: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853448.85462: _low_level_execute_command(): starting 18714 1726853448.85465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853448.5653589-20777-235745962993866/ > /dev/null 2>&1 && sleep 0' 18714 1726853448.86238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853448.86251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853448.86323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.86393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853448.86426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.86540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.88323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.88350: stdout chunk (state=3): >>><<< 18714 1726853448.88356: stderr chunk (state=3): >>><<< 18714 1726853448.88372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853448.88481: handler run complete 18714 1726853448.88484: Evaluated conditional (False): False 18714 1726853448.88487: attempt loop complete, returning result 18714 1726853448.88489: _execute() done 18714 1726853448.88491: dumping result to json 18714 1726853448.88493: done dumping result, returning 18714 1726853448.88495: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [02083763-bbaf-e784-4f7d-00000000057e] 18714 1726853448.88497: sending task result for task 02083763-bbaf-e784-4f7d-00000000057e ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008613", "end": "2024-09-20 13:30:48.836011", "rc": 0, "start": "2024-09-20 13:30:48.827398" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3000sec preferred_lft 3000sec inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium 
RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 18714 1726853448.88685: no more pending results, returning what we have 18714 1726853448.88689: results queue empty 18714 1726853448.88690: checking for any_errors_fatal 18714 1726853448.88692: done checking for any_errors_fatal 18714 1726853448.88693: checking for max_fail_percentage 18714 1726853448.88697: done checking for max_fail_percentage 18714 1726853448.88698: checking to see if all hosts have failed and the running result is not ok 18714 1726853448.88698: done checking to see if all hosts have failed 18714 1726853448.88699: getting the remaining hosts for this loop 18714 1726853448.88700: done getting the remaining hosts for this loop 18714 1726853448.88704: getting the next task for host managed_node1 18714 1726853448.88712: done getting next task for host managed_node1 18714 1726853448.88715: ^ task is: TASK: Verify DNS and network connectivity 18714 1726853448.88717: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853448.88721: getting variables 18714 1726853448.88723: in VariableManager get_vars() 18714 1726853448.88755: Calling all_inventory to load vars for managed_node1 18714 1726853448.88758: Calling groups_inventory to load vars for managed_node1 18714 1726853448.88762: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853448.88977: Calling all_plugins_play to load vars for managed_node1 18714 1726853448.88987: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853448.89002: Calling groups_plugins_play to load vars for managed_node1 18714 1726853448.89616: done sending task result for task 02083763-bbaf-e784-4f7d-00000000057e 18714 1726853448.89620: WORKER PROCESS EXITING 18714 1726853448.90783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853448.92597: done with get_vars() 18714 1726853448.92620: done getting variables 18714 1726853448.92695: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:30:48 -0400 (0:00:00.423) 0:00:45.310 ****** 18714 1726853448.92727: entering _queue_task() for managed_node1/shell 18714 1726853448.93289: worker is 1 (out of 1 available) 18714 1726853448.93299: exiting _queue_task() for managed_node1/shell 18714 1726853448.93309: done queuing things up, now waiting for results queue to drain 18714 1726853448.93310: waiting for pending results... 
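As an aside (not playbook code): the RESOLV block dumped by the "Check routes and DNS" task above lists the resolvers that the follow-up "Verify DNS and network connectivity" task will exercise. Extracting them from resolv.conf-style text is straightforward; an illustrative sketch using the values from the log:

```python
# Illustrative only: pull nameserver entries out of resolv.conf-style text,
# mirroring the RESOLV section printed by the "Check routes and DNS" task.
resolv = """\
# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1
"""

nameservers = [
    line.split(maxsplit=1)[1]
    for line in resolv.splitlines()
    if line.startswith("nameserver")
]

print(nameservers)  # the three resolvers shown in the task output above
```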
18714 1726853448.93403: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 18714 1726853448.93532: in run() - task 02083763-bbaf-e784-4f7d-00000000057f 18714 1726853448.93566: variable 'ansible_search_path' from source: unknown 18714 1726853448.93576: variable 'ansible_search_path' from source: unknown 18714 1726853448.93660: calling self._execute() 18714 1726853448.93729: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853448.93740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853448.93768: variable 'omit' from source: magic vars 18714 1726853448.94187: variable 'ansible_distribution_major_version' from source: facts 18714 1726853448.94309: Evaluated conditional (ansible_distribution_major_version != '6'): True 18714 1726853448.94376: variable 'ansible_facts' from source: unknown 18714 1726853448.95213: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 18714 1726853448.95225: variable 'omit' from source: magic vars 18714 1726853448.95268: variable 'omit' from source: magic vars 18714 1726853448.95319: variable 'omit' from source: magic vars 18714 1726853448.95365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18714 1726853448.95416: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18714 1726853448.95511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18714 1726853448.95514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853448.95517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18714 1726853448.95522: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18714 1726853448.95530: variable 
'ansible_host' from source: host vars for 'managed_node1' 18714 1726853448.95537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853448.95657: Set connection var ansible_shell_executable to /bin/sh 18714 1726853448.95673: Set connection var ansible_timeout to 10 18714 1726853448.95684: Set connection var ansible_module_compression to ZIP_DEFLATED 18714 1726853448.95696: Set connection var ansible_connection to ssh 18714 1726853448.95705: Set connection var ansible_shell_type to sh 18714 1726853448.95714: Set connection var ansible_pipelining to False 18714 1726853448.95757: variable 'ansible_shell_executable' from source: unknown 18714 1726853448.95837: variable 'ansible_connection' from source: unknown 18714 1726853448.95843: variable 'ansible_module_compression' from source: unknown 18714 1726853448.95846: variable 'ansible_shell_type' from source: unknown 18714 1726853448.95847: variable 'ansible_shell_executable' from source: unknown 18714 1726853448.95849: variable 'ansible_host' from source: host vars for 'managed_node1' 18714 1726853448.95851: variable 'ansible_pipelining' from source: unknown 18714 1726853448.95855: variable 'ansible_timeout' from source: unknown 18714 1726853448.95857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18714 1726853448.95961: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853448.96055: variable 'omit' from source: magic vars 18714 1726853448.96059: starting attempt loop 18714 1726853448.96061: running the handler 18714 1726853448.96064: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18714 1726853448.96066: _low_level_execute_command(): starting 18714 1726853448.96068: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18714 1726853448.96961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853448.96966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.97023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853448.97068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.97136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853448.98809: stdout chunk (state=3): >>>/root <<< 18714 1726853448.98961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853448.98964: stdout chunk (state=3): >>><<< 18714 1726853448.98966: stderr chunk 
(state=3): >>><<< 18714 1726853448.98989: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853448.99008: _low_level_execute_command(): starting 18714 1726853448.99092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557 `" && echo ansible-tmp-1726853448.9899545-20796-219710986959557="` echo /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557 `" ) && sleep 0' 18714 1726853448.99543: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853448.99556: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18714 1726853448.99559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18714 1726853448.99561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 18714 1726853448.99564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853448.99612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853448.99617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853448.99657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853449.01510: stdout chunk (state=3): >>>ansible-tmp-1726853448.9899545-20796-219710986959557=/root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557 <<< 18714 1726853449.01658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853449.01661: stderr chunk (state=3): >>><<< 18714 1726853449.01664: stdout chunk (state=3): >>><<< 18714 1726853449.01686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853448.9899545-20796-219710986959557=/root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853449.01723: variable 'ansible_module_compression' from source: unknown 18714 1726853449.01792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18714ti53trv1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18714 1726853449.01820: variable 'ansible_facts' from source: unknown 18714 1726853449.01892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/AnsiballZ_command.py 18714 1726853449.02106: Sending initial data 18714 1726853449.02110: Sent initial data (156 bytes) 18714 1726853449.02605: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853449.02614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853449.02685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853449.02722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853449.02733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853449.02758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853449.02815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853449.04364: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18714 
1726853449.04394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18714 1726853449.04431: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18714ti53trv1/tmpobrn8h7n /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/AnsiballZ_command.py <<< 18714 1726853449.04435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/AnsiballZ_command.py" <<< 18714 1726853449.04499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18714ti53trv1/tmpobrn8h7n" to remote "/root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/AnsiballZ_command.py" <<< 18714 1726853449.05082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853449.05125: stderr chunk (state=3): >>><<< 18714 1726853449.05128: stdout chunk (state=3): >>><<< 18714 1726853449.05148: done transferring module to remote 18714 1726853449.05157: _low_level_execute_command(): starting 18714 1726853449.05162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/ /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/AnsiballZ_command.py && sleep 0' 18714 1726853449.05554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853449.05558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853449.05588: stderr chunk (state=3): >>>debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853449.05595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853449.05597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853449.05644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853449.05647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853449.05696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853449.07578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853449.07598: stderr chunk (state=3): >>><<< 18714 1726853449.07601: stdout chunk (state=3): >>><<< 18714 1726853449.07677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853449.07681: _low_level_execute_command(): starting 18714 1726853449.07683: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/AnsiballZ_command.py && sleep 0' 18714 1726853449.08626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853449.08631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853449.08634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853449.08646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853449.08708: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853449.08724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853449.08785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853449.69503: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 948 0 --:--:-- --:--:-- --:--:-- 950\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 
0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2580 0 --:--:-- --:--:-- --:--:-- 2575", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:30:49.236845", "end": "2024-09-20 13:30:49.693852", "delta": "0:00:00.457007", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18714 1726853449.71208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 18714 1726853449.71212: stdout chunk (state=3): >>><<< 18714 1726853449.71218: stderr chunk (state=3): >>><<< 18714 1726853449.71240: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 948 0 --:--:-- --:--:-- --:--:-- 950\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2580 0 --:--:-- --:--:-- --:--:-- 2575", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor 
host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:30:49.236845", "end": "2024-09-20 13:30:49.693852", "delta": "0:00:00.457007", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 18714 1726853449.71299: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18714 1726853449.71306: _low_level_execute_command(): starting 18714 1726853449.71312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853448.9899545-20796-219710986959557/ > /dev/null 2>&1 && sleep 0' 18714 1726853449.71965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18714 1726853449.72175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853449.72179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853449.72181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 
1726853449.72184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18714 1726853449.72186: stderr chunk (state=3): >>>debug2: match not found <<< 18714 1726853449.72188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853449.72190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18714 1726853449.72192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 18714 1726853449.72194: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18714 1726853449.72196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18714 1726853449.72197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18714 1726853449.72199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18714 1726853449.72202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18714 1726853449.72204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18714 1726853449.72206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18714 1726853449.72208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18714 1726853449.72274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18714 1726853449.74167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18714 1726853449.74172: stdout chunk (state=3): >>><<< 18714 1726853449.74179: stderr chunk (state=3): >>><<< 18714 1726853449.74206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18714 1726853449.74213: handler run complete 18714 1726853449.74237: Evaluated conditional (False): False 18714 1726853449.74247: attempt loop complete, returning result 18714 1726853449.74250: _execute() done 18714 1726853449.74255: dumping result to json 18714 1726853449.74258: done dumping result, returning 18714 1726853449.74270: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [02083763-bbaf-e784-4f7d-00000000057f] 18714 1726853449.74275: sending task result for task 02083763-bbaf-e784-4f7d-00000000057f 18714 1726853449.74395: done sending task result for task 02083763-bbaf-e784-4f7d-00000000057f 18714 1726853449.74397: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if 
! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.457007", "end": "2024-09-20 13:30:49.693852", "rc": 0, "start": "2024-09-20 13:30:49.236845" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 948 0 --:--:-- --:--:-- --:--:-- 950 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2580 0 --:--:-- --:--:-- --:--:-- 2575 18714 
1726853449.74474: no more pending results, returning what we have 18714 1726853449.74483: results queue empty 18714 1726853449.74484: checking for any_errors_fatal 18714 1726853449.74491: done checking for any_errors_fatal 18714 1726853449.74492: checking for max_fail_percentage 18714 1726853449.74494: done checking for max_fail_percentage 18714 1726853449.74495: checking to see if all hosts have failed and the running result is not ok 18714 1726853449.74496: done checking to see if all hosts have failed 18714 1726853449.74496: getting the remaining hosts for this loop 18714 1726853449.74498: done getting the remaining hosts for this loop 18714 1726853449.74502: getting the next task for host managed_node1 18714 1726853449.74511: done getting next task for host managed_node1 18714 1726853449.74514: ^ task is: TASK: meta (flush_handlers) 18714 1726853449.74516: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18714 1726853449.74520: getting variables 18714 1726853449.74522: in VariableManager get_vars() 18714 1726853449.74552: Calling all_inventory to load vars for managed_node1 18714 1726853449.74555: Calling groups_inventory to load vars for managed_node1 18714 1726853449.74559: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853449.74782: Calling all_plugins_play to load vars for managed_node1 18714 1726853449.74786: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853449.74790: Calling groups_plugins_play to load vars for managed_node1 18714 1726853449.77308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853449.80434: done with get_vars() 18714 1726853449.80456: done getting variables 18714 1726853449.80533: in VariableManager get_vars() 18714 1726853449.80543: Calling all_inventory to load vars for managed_node1 18714 1726853449.80545: Calling groups_inventory to load vars for managed_node1 18714 1726853449.80548: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853449.80552: Calling all_plugins_play to load vars for managed_node1 18714 1726853449.80555: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853449.80558: Calling groups_plugins_play to load vars for managed_node1 18714 1726853449.81849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853449.83566: done with get_vars() 18714 1726853449.83603: done queuing things up, now waiting for results queue to drain 18714 1726853449.83606: results queue empty 18714 1726853449.83607: checking for any_errors_fatal 18714 1726853449.83617: done checking for any_errors_fatal 18714 1726853449.83618: checking for max_fail_percentage 18714 1726853449.83620: done checking for max_fail_percentage 18714 1726853449.83620: checking to see if all hosts have failed and the running result is not 
ok 18714 1726853449.83621: done checking to see if all hosts have failed 18714 1726853449.83622: getting the remaining hosts for this loop 18714 1726853449.83623: done getting the remaining hosts for this loop 18714 1726853449.83626: getting the next task for host managed_node1 18714 1726853449.83630: done getting next task for host managed_node1 18714 1726853449.83632: ^ task is: TASK: meta (flush_handlers) 18714 1726853449.83633: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18714 1726853449.83636: getting variables 18714 1726853449.83637: in VariableManager get_vars() 18714 1726853449.83654: Calling all_inventory to load vars for managed_node1 18714 1726853449.83656: Calling groups_inventory to load vars for managed_node1 18714 1726853449.83659: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853449.83677: Calling all_plugins_play to load vars for managed_node1 18714 1726853449.83680: Calling groups_plugins_inventory to load vars for managed_node1 18714 1726853449.83683: Calling groups_plugins_play to load vars for managed_node1 18714 1726853449.86317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18714 1726853449.90469: done with get_vars() 18714 1726853449.90683: done getting variables 18714 1726853449.90735: in VariableManager get_vars() 18714 1726853449.90745: Calling all_inventory to load vars for managed_node1 18714 1726853449.90748: Calling groups_inventory to load vars for managed_node1 18714 1726853449.90750: Calling all_plugins_inventory to load vars for managed_node1 18714 1726853449.90755: Calling all_plugins_play to load vars for managed_node1 18714 1726853449.90757: Calling groups_plugins_inventory to load vars for 
managed_node1
18714 1726853449.90759: Calling groups_plugins_play to load vars for managed_node1
18714 1726853449.93046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18714 1726853449.94769: done with get_vars()
18714 1726853449.94807: done queuing things up, now waiting for results queue to drain
18714 1726853449.94809: results queue empty
18714 1726853449.94810: checking for any_errors_fatal
18714 1726853449.94812: done checking for any_errors_fatal
18714 1726853449.94812: checking for max_fail_percentage
18714 1726853449.94813: done checking for max_fail_percentage
18714 1726853449.94814: checking to see if all hosts have failed and the running result is not ok
18714 1726853449.94815: done checking to see if all hosts have failed
18714 1726853449.94816: getting the remaining hosts for this loop
18714 1726853449.94817: done getting the remaining hosts for this loop
18714 1726853449.94819: getting the next task for host managed_node1
18714 1726853449.94823: done getting next task for host managed_node1
18714 1726853449.94824: ^ task is: None
18714 1726853449.94825: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18714 1726853449.94826: done queuing things up, now waiting for results queue to drain
18714 1726853449.94827: results queue empty
18714 1726853449.94828: checking for any_errors_fatal
18714 1726853449.94828: done checking for any_errors_fatal
18714 1726853449.94829: checking for max_fail_percentage
18714 1726853449.94830: done checking for max_fail_percentage
18714 1726853449.94831: checking to see if all hosts have failed and the running result is not ok
18714 1726853449.94831: done checking to see if all hosts have failed
18714 1726853449.94833: getting the next task for host managed_node1
18714 1726853449.94835: done getting next task for host managed_node1
18714 1726853449.94836: ^ task is: None
18714 1726853449.94837: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=83   changed=3    unreachable=0    failed=0    skipped=73   rescued=0    ignored=1

Friday 20 September 2024  13:30:49 -0400 (0:00:01.022)       0:00:46.333 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 2.02s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which services are running ---- 1.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.80s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.80s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.73s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.56s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface lsr27 --------------------------------------------- 1.21s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.11s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.09s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.07s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Verify DNS and network connectivity ------------------------------------- 1.02s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gathering Facts --------------------------------------------------------- 0.99s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.99s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.97s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.90s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33
Gathering Facts --------------------------------------------------------- 0.87s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
18714 1726853449.95054: RUNNING CLEANUP